2018 EPP Annual Report - MSU Billings

2018 EPP Annual Report
CAEP ID: 10033   AACTE SID: 1030
Institution: Montana State University-Billings
Unit: College of Education

Section 1. AIMS Profile
After reviewing and/or updating the Educator Preparation Provider's (EPP's) profile in AIMS, check the box to indicate that the information available is accurate.
1.1 In AIMS, the following information is current and accurate (Agree / Disagree):
1.1.1 Contact person
1.1.2 EPP characteristics
1.1.3 Program listings

Section 2. Program Completers
2.1 How many candidates completed programs that prepared them to work in preschool through grade 12 settings during Academic Year 2016-2017? Enter a numeric value for each textbox.
2.1.1 Number of completers in programs leading to initial teacher certification or licensure [1]: 123
2.1.2 Number of completers in advanced programs or programs leading to a degree, endorsement, or some other credential that prepares the holder to serve in P-12 schools (do not include those completers counted above) [2]: 47
Total number of program completers: 170
[1] For a description of the scope for Initial-Licensure Programs, see Policy 3.01 in the Accreditation Policy Manual.
[2] For a description of the scope for Advanced-Level Programs, see Policy 3.02 in the Accreditation Policy Manual.
Section 3. Substantive Changes
Have any of the following substantive changes occurred at your educator preparation provider or institution/organization during the 2016-2017 academic year?
3.1 Changes in the established mission or objectives of the institution/organization or the EPP
No Change / Not Applicable
3.2 Any change in the legal status, form of control, or ownership of the EPP.
No Change / Not Applicable
3.3 The addition of programs of study at a degree or credential level different from those that were offered when most recently accredited
No Change / Not Applicable
3.4 The addition of courses or programs that represent a significant departure, in terms of either content or delivery, from those that were offered when most recently accredited
No Change / Not Applicable
3.5 A contract with other providers for direct instructional services, including any teach-out agreements
No Change / Not Applicable
Contents
Unit Assessment System Overview and History
Conceptual Framework
Developmental Progression of Assessment
Decision Points for Assessing Candidate Quality
Acting on Data
Non-Academic Assessment
Appendix: Current Key Assessment Instruments
Dispositions
Mission, Vision, and Goals
Montana State University Billings, as part of the Montana University System, works within the purview of the system-wide public institutional mission of serving the post-secondary needs of the citizens of Montana. MSUB’s Core Purpose is “To assure that all members of the university community reach their individual potential.” The MSUB four-part mission is below.
MSU Billings provides a university experience characterized by:
· Excellent Teaching
· Intellectual, Cultural, Social and Economic Community Enhancement
The College of Education faculty and staff enact their role in this mission by establishing the COE Vision and Mission Statements, by living the COE Educator Oath, and by reaching the goals described below.
COE Mission: The College of Education is dedicated to:
· Preparing compassionate, caring and committed professionals for schools in Montana and beyond;
· Conducting socially significant applied research to improve the human condition;
· Providing community service to improve the human condition;
· Providing graduate education and continuing education for career-long professional development.
COE Vision: The College of Education will be recognized as a regional leader in:
· Providing fully qualified educators who positively impact the academic achievement of each individual learner;
· Developing and modeling state-of-the-art educational practice;
· Sharing the scholarly work of faculty and students through active involvement in professional associations;
· Demonstrating ongoing program excellence through national, regional and state accreditation performance.
Ethical and supportive dispositions are important, albeit non-academic, qualities exhibited by good teachers. COE faculty and COE Completers live the Educator Oath below, which is based upon the Montana Educator Code of Ethics.
· I dedicate myself to the life of an educator, nurturing others to fulfill their potential, live meaningful lives and fully participate in our society
· I dedicate myself to assuring excellent educational opportunities for all individuals
· I pledge to make the well-being of my students the fundamental value in my professional decisions and actions
· I recognize parents and community as integral to education and pledge to encourage their involvement
· I dedicate myself to teaching the virtues of honesty, respect, trust and courage
· I commit myself to continuing my own personal growth and professional development, for I must bear witness in my life to the ideals being encouraged in others
· In cooperation with my colleagues, I pledge to advance education professions
· I pledge myself to serving both educational and larger community needs
· I pledge to fulfill these professional responsibilities with diligence, integrity and dedication
The COE has established two types of goals, as explained below.
First, the COE has adopted goals for student learning, referred to as student learning outcomes (SLOs), for initial licensure programs that are based directly on the 2013 InTASC Standards as well as the Montana State Professional Education Preparation Program Standards (PEPPS). The SLOs are aligned with our conceptual framework as well as the InTASC and PEPPS Standards so that one set of data can shed light on multiple professional standards and learning expectations.
Second, the COE has articulated operational goals as defined in its strategic plans. These strategic plans exist as separate, yet related, documents that support the COE Assessment System. These plans, in combination with the COE Assessment System, constitute the MSUB COE Quality Assurance System as required by CAEP Standard 5.
· Clinical Practice and Partnerships Selected Improvement Plan
· Quality, Selectivity, and Recruitment Phase-In Plan
· Program Impact Phase-In Plan
Quality Assurance System
The MSUB College of Education (COE) Quality Assurance System (QAS) responds to a recent paradigm shift from a singular unit assessment system to an integrated system of quality assurance that includes operational plans and goals. The MSUB Unit Assessment System and its accompanying three Plans join together to create the QAS. Maintaining documentation of the QAS and directing the workload of the closely associated committee structure are the joint responsibility of the COE Assessment Coordinator, the COE Data Specialist, the COE Dean, and the faculty-led CAEP Standards Committees. These stakeholders work together with other MSUB Offices (e.g., Information Technology) and Clinical Educators (our P-20 Partners) to accomplish the Plans and monitor Pre-candidate and Candidate development according to the Unit Assessment System.
[Diagram: visualizing the four parts of the COE Quality Assurance System (the Unit Assessment System and its three accompanying Plans), as required by CAEP Standard 5.]
Unit Assessment System Overview and History
The primary purpose of assessment for the Montana State University Billings (MSUB) College of Education (COE) is to assure the quality and continuous improvement of COE programs. The assessment system provides an integrated structure for the collection, aggregation, and sharing of data on candidates, programs and the unit. The system enables faculty to ensure that the unit is graduating highly qualified candidates; has in place excellent, continuously improving programs of study; hires and retains an outstanding faculty; and supports programs with unit operations that are of the highest quality.
In fulfilling its Mission and the COE Educators' Oath, the COE articulated a reflective practice model as a guide for assessment of the EPP (NCATE Review, 1992). This model was reviewed and revised during the next five years (NCATE Review, 1997). The model underwent extensive revision and evolved into the Reflective Practice Conceptual Frameworks for Initial and Advanced Programs during AY 2000-2001 and 2001-2002 (NCATE Review, 2002). Based on InTASC and NBPTS principles, the conceptual frameworks were cumbersome, with as many as 72 indicators at the initial licensure undergraduate level. The conceptual frameworks were filled with developmental outcomes and benchmarks, and they won the AACTE award for Excellence in Accreditation in 2002. As candidates developed portfolios and faculty reviewed candidates' reflections on their portfolio artifacts, the data gathered indicated that the frameworks were too detailed. Because the process for candidates, clinical educators, and COE faculty was so arduous, the review of artifacts became trivial; as a result, much of the data from the previous assessment system would not be considered reliable or valid under today's CAEP Standards. The COE intention of scaffolding candidates' understanding of their place in their overall development as reflective practitioners was not achieved. Candidates and faculty were losing sight of the forest (the Reflective Practice Model) for the trees (the indicators' minutiae).
The redesign of the assessment system was accomplished collaboratively with members of the COE faculty, four teachers/administrators from local P-12 schools, and two faculty from the College of Arts and Sciences, who represented faculty in charge of secondary education programs. These stakeholders, most of whom served on the unit’s Assessment Committee, met to complete the redesign.
In Fall 2005, the COE initiated a lengthy review of all conceptual frameworks that resulted in simpler, more understandable, and user-friendly program outcomes. While maintaining its commitment to developing Reflective Practitioners, the COE reduced the conceptual frameworks to their underlying foundations. The Initial Conceptual Framework outcomes were re-envisioned as the 10 InTASC principles with Montana-specific modifications incorporated. At that time, the Advanced Conceptual Framework was also re-written as six indicators that align with NBPTS Standards and assume InTASC competence.
Because the InTASC Standards are emphasized in the conceptual framework, the framework was reviewed after the InTASC Standards were revised in 2011. No major changes were made in 2011; however, in 2015-16, the initial and advanced conceptual frameworks were again revisited.
Discussion on the framework for advanced programs was tabled because CAEP's Standards for advanced programs were in flux. Discussion on the framework for initial programs ensued. The updated InTASC Standards and the COE's philosophical commitment to developing reflective practitioners both informed the revisiting of the initial licensure program framework, as did the ability to operationalize the InTASC Standards by using the four broad domains of Charlotte Danielson's (1996, 2007) Enhancing Professional Practice: A Framework for Teaching. Now included in the framework for initial programs is language that shows our specific commitment to Montana's Indian Education for All (IE4A) initiative as well as culturally responsive teaching in general. This language is expressed as an addendum to the InTASC Standards and labeled as the "COE Standard."
In Fall 2016, a slogan was discussed to help express our framework to our stakeholders (especially students) and illustrate the cohesion across our programs. Although a slogan was not adopted, COE faculty continue to discuss the best way to set the tone for our developmentally-sensitive COE assessment system.
During this last update to our framework and to the assessment system, we decided to focus specifically on initial licensure programs as we await the Council for the Accreditation of Educator Preparation’s (CAEP’s) finalized expectations for advanced programs. What follows in this document is specifically an assessment system for initial licensure programs that will be updated in 2017-18 after we partake in collaborative work with our stakeholders to solidify our expectations of candidates in advanced licensure programs and programs that do not lead to licensure (master’s programs in school counseling and online instructional technology, for example).
Conceptual Framework
Domains 1 and 3 each triangulate three InTASC Categories, while Domains 2 and 4 correlate directly with a single Category. Each Domain is assessed, and the results are reported for each program area by InTASC Category, as required by CAEP.
MSUB COE Conceptual Framework
"Good" teaching is reflective teaching. Four domains are provided to frame reflection on practice. Each domain is supported by the student learning outcomes found in the InTASC Standards. A full description of each domain's components can be found in Danielson, C. (2011), Enhancing Professional Practice: A Framework for Teaching, on which the domains are based.
[Diagram: the four domains (Domain 1 Planning and Preparation, Domain 2 Classroom Environment, Domain 3 Instruction, Domain 4 Professional Responsibility) mapped to the InTASC Standards (including #5 Application of Content, #6 Assessment, #7 Planning for Instruction, #8 Instructional Strategies, and #10 Leadership & Collaboration) and the COE Standard.]
The InTASC Standards’ language provides student learning outcome statements for COE programs. The Standards are grouped by conceptual framework domain above, and below they are grouped by InTASC Category. The InTASC Categories cut across the conceptual framework domains. Note that the COE has added its own standard to address culturally responsive teaching.
Standard Categories and Associated Domains
The Learner and Learning: Domains 1 & 2
Content: Domains 1 & 3
Instructional Practice: Domains 1 & 3
Professional Responsibility: Domain 4
COE Standard: Domain 3
QAS Committee Structure
The model of the COE QAS committee structure below shows the system's responsible parties. The structure is made up of four committees, with every faculty member in the COE serving on at least one committee. Committee roles are aligned with the CAEP Standards.
The Dean collaborates with the faculty as a member of the Standards 4 & 5 Committee, which consists of the chairs of Committees 1, 2, and 3 as well as the COE Professor of Assessment and Accreditation (i.e., the “Assessment Coordinator,” a full-time, tenure-track appointment with a 50% teaching load) and the COE Assessment Data Specialist (a full-time staff position). The Assessment Coordinator also serves on Committees 1, 2, and 3 as an ex officio member who provides guidance for each of the committees. As part of the COE Selected Improvement Plan for Clinical Practices and Partnerships, the COE is currently developing relationships with stakeholders by including them in its quality assurance system as partners, as needed. For example, as part of the Dean’s contribution to the Standards 4 & 5 Committee, the Dean attends a monthly higher education partnership meeting focused on developing a system-wide approach for measuring candidate impact on P12 students across the Montana University System’s Colleges of Education.
[Diagram: QAS committee structure. The Standard 1 Committee focuses on content knowledge and pedagogy; the Standard 2 Committee on the design, quality, and logistics of field experiences; the Standard 3 Committee on the choice and recruitment of candidates; and the Standards 4 & 5 Committee on program completers' impact on P-12 learning and on how the COE as a whole shows that it is using assessment data for continuous improvement. Each committee is led by a chair.]
Unit Assessment System: Assessment Life Cycle
1. Articulate the Student Learning Outcome (SLO) and performance criteria.
2. Identify the best tool for measuring the performance criteria.
3. Conduct the measurement and gather data.
4. Analyze data and interpret the assessment results.
5. Use the results to improve the student learning experience.
Across the cycle: Standard 1, 2, & 3 Committees develop assessments, and the Standards 4 & 5 Committee tests their reliability and validity; P-20 partners help to construct assessments and provide feedback to COE and clinical faculty; COE faculty and Clinical Educators conduct the assessments and provide data to the COE Assessment Data Specialist; the COE Assessment Coordinator partners with COE Committees to analyze and interpret results; and COE Committees bring recommendations to COE faculty and administration, who collaborate on continuous improvement.
The assessment life cycle above has been updated for Fall 2016 to show the inclusion of two new MSUB COE members: the assessment coordinator and the assessment data specialist. The data specialist is responsible for collecting data and maintaining the information in our database, while the assessment coordinator is responsible for helping the Committees analyze and act on data. Also new for Fall 2016 is the inclusion of P-20 partners as a central component of our assessment life cycle. While we have continuously partnered with P12 educators and other institutions of higher education in the past, most of our partnerships have not been formally systematized. With P-20 partners as the central component of the COE Unit Assessment System, the COE can maximize partner participation in the co-construction of assessments and the use of resulting data for continuous improvement.
Unit Assessment System: Developmental Progression of Student Learning Outcomes Assessment
MSUB COE pre-candidates and candidates experience assessment in each course that they take; however, in the interest of being parsimonious with COE resources, not all course assignments are collected and analyzed as key assessments. Rather, multiple key assessments are analyzed at a series of Decision Points in order to determine the development of candidate quality at: (1) admission into the educator preparation program (i.e., into COE Candidacy), (2) admission to clinical practice, (3) program completion prior to recommendation for licensure, and (4) after completion, while completers are working as new teachers with P-12 students. As part of its Selected Improvement Plan, the COE plans to follow candidates two to five years after completion, in accordance with CAEP recommendations (see the CAEP FAQ on CAEP Standard 4). A visual overview of the student learning outcomes assessment progression is below, with assessment details appearing in the chart that follows.
Multiple Assessment Measures at Key Decision Points
[Diagram: developmental progression from Pre-candidates, through application and admission to the educator preparation program (COE Candidacy), to Candidates and Student Teachers.]
Decision Point 1: Admission to the educator preparation program (evaluated upon application to the EPP)
· During EDU 220: Sophomore Disposition Evaluation.
· During coursework in the professional core, content core, and general education: student learning outcomes of courses in content and professional knowledge, measured by Calculated GPA (Elementary Majors' content core coursework and Secondary/K-12 Majors' general education core coursework).
Decision Point 2: Admission to clinical practice (evaluated prior to Student Teaching)
· Performance on SLOs in the InTASC Standards, PEPPS, and culturally responsive teaching: Formative Junior Field Evaluations; Summative Junior Field Evaluations.
· Modeling and application of technology standards: Educational Technology Project.
· Content knowledge and pedagogical knowledge: Evidence of Professional Growth (incl. Lesson Plans).
· Attributes and dispositions beyond academic ability: Junior Field Dispositions Evaluation.
· Skill and commitment to college- and career-readiness standards: Common Core Position Statement.
· Student learning outcomes of courses in content and professional knowledge: Calculated GPA (all Candidates evaluated on overall GPA and both content and professional core GPAs at this point).
Decision Point 3: Program completion, prior to recommendation for licensure (evaluated during Student Teaching)
· Performance on SLOs in the InTASC Standards, PEPPS Standards, technology, and culturally responsive teaching: Formative Student Teaching Evaluations; Summative Student Teaching Evaluations; Assessment of Content Pedagogy.
· Evidence of impact on P-12 learning: Evidence of Professional Growth (incl. Lesson Plans).
· Attributes and dispositions beyond academic ability: Student Teaching Dispositions Evaluation.
· Student learning outcomes of courses in content and professional knowledge: Calculated GPA (all Candidates evaluated on overall GPA, general education GPA, and both content and professional core GPAs at this point).
· At completion: Program Completer Survey.
Decision Point 4: Post completion (during employment)
· Case Study: Billings Catholic Schools and Aaniiih Nakoda Tribal College.
Unit Assessment System: Acting on Data
A graphic organizer describing roles of the COE unit assessment system is presented on the following page; it shows the relationship between (a) the CAEP Standards, (b) the Committees who monitor progress toward the Standards, and (c) who and what are assessed by the COE faculty. In the first column, this model shows who is assessed and what about them is assessed. It shows who is responsible for conducting the assessment in the second column. The third column lists the party responsible for analyzing and acting on data for the purposes of quality assurance and continuous improvement. “Acting” means that this party presents a recommendation to the COE faculty body for program improvement, supporting their recommendation with valid and reliable assessment data. Recommendations may be written or presented orally during monthly COE Assessment and Accreditation Meetings. It is important to note the iterative, developmental process of our assessment system; no singular party or Committee “owns” assessment. Each Committee contributes to more than one assessed component.
During the academic year, the Committees meet individually at least once per semester (monthly over the past few years). The Committee Chair is assisted by support personnel (the Assessment Coordinator and the Assessment Data Specialist) so as to ensure a collaborative structure that feeds back into the loop of Quality Assurance. Because the Assessment Coordinator and Assessment Data Specialist serve on every Committee, there is a continuity of information flow between the Committees. This helps to avoid duplication of work and to keep each Committee focused on its role in the overall Quality Assurance System. Having support personnel means that Committees are able to spend more time focusing on the interpretation and use of instruments and data; Committee members do not have to spend time inputting, manipulating, and storing data. Committee members are able to switch Committees, although membership is generally similar from year to year.
The COE faculty hold COE Assessment and Accreditation Meetings at least once per semester. Meetings generally focus on a particular CAEP Standard or key assessment. In the fall of each year, faculty are required to return two weeks prior to the first day of class. During this time, a two-day assessment retreat is scheduled during which the Assessment Coordinator presents the results of the prior year’s assessment efforts, leading the faculty through a process of goal analysis and reflection. Committees are able to use this retreat to bring forth any plans or recommendations to the COE faculty, as is the case during the monthly COE Assessment and Accreditation Meeting.
[Table, with columns "Who and what is assessed?", "Who does the assessing?", and "Who analyzes and acts on data?": rows include components for which the Standard 3 Committee both conducts the assessment and analyzes and acts on the data; an assessment conducted by the Department Rank & Tenure Committee and acted on by that committee and individual faculty; and the assessment of Clinical Partnerships.]
Reliability and Validity Plan
1. All instructors of Professional Core Courses will complete Course Alignment Charts showing their alignment with the CAEP Standards, with InTASC, and with the Montana PEPPS. Face validity of assessments' alignment with standards will be discussed. Overseen by Standard Committee 1 during AY2015-16; information compiled by the Assessment Coordinator and presented and discussed during the Fall Retreat 2016.
2. All key assessments will be mapped to the Professional Core Course Matrix, AY 2015-16. Overseen by Standard Committee 1 during AY2015-16; information compiled by the Assessment Coordinator and presented and discussed during the Fall Retreat 2016.
3. One program will complete Course Alignment Charts and review the breadth and depth of its curriculum in comparison to discipline-specific standards as well as the professional core standards. Overseen by Standard Committee 1 and the Reading Program Committee during AY2015-16; information compiled by the Reading Program Leader and presented to the Assessment Coordinator; curricular changes based on this review were presented to and approved by the COE faculty in Fall 2016.
4. Additional key assessments will be identified and piloted to expand the breadth and depth of the COE unit assessment system. To be overseen by Standard Committee 1 during AY2017-18; information compiled by the Data Specialist and Assessment Coordinator will be presented and discussed during Spring 2017 and the Fall Retreat 2017, at which time the instruments will either be adapted or adopted into the assessment system, and they will be subject to further reliability and validity testing. The Reading Program will pilot new assessments in revised courses beginning 2017-18.
5. All key assessments, in use or piloted, will be revised for 100% alignment with the CAEP rubric for EPP-submitted assessments. Best practices in rubric development will be reviewed and implemented (e.g., rubrics will align to one standard per criterion). Overseen by Standard Committee 1 and the CAEP Standard 2 Committee during AY2017-18; information compiled by the Assessment Coordinator and presented and discussed during the Fall Retreat 2018.
6. All key assessments created by the COE will be subject to additional tests of reliability and validity. These tests will be: inter-rater reliability (e.g., between Cooperating Teachers), inter-assessment reliability (triangulation of GPA, the Assessment of Content Pedagogy, and the Evidence of Professional Growth), and content validity using Lawshe's method, engaging program areas with P-20 partners and experts (cross-reference our Clinical Practice and Partnership SIP; an illustrative sketch of these reliability and validity computations follows this list). Overseen by the Standard 4 & 5 Committee, which provides guidance and technical assistance, in conjunction with the Standard 2 Committee, which organizes implementation of these tests with P-20 Partners, the Assessment Coordinator, and the Data Specialist. The first program to pilot the testing of reliability and validity will be the Reading Program, in collaboration with its program advisory group, which will be formed in AY2016-17 and begin work in 2017-18.
7. COE faculty, clinical educators, completers, and candidates will be surveyed and/or asked to participate in focus groups for their perception of our assessments’ usefulness and the results will be used to help determine adaptations to be made. Overseen by the Standard 4 & 5 Committee, with planning beginning Spring 2017 and implementation coinciding with case study research at Billings Catholic Schools in 2017-18. This work is an extension of the Montana Council of Deans Ad Hoc Standard 4 Committee.
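To make item 6 above concrete, the short Python sketch below illustrates two statistics of the kind the plan names: Lawshe's content validity ratio, CVR = (n_e - N/2) / (N/2), where n_e is the number of panelists rating a rubric item "essential" and N is the panel size, and inter-rater agreement between two raters (simple percent agreement and Cohen's kappa). This is an illustrative sketch only; the panel size, rater roles, and scores are hypothetical, and the COE plan does not prescribe these particular computations or any code.

from collections import Counter

def lawshe_cvr(n_essential: int, n_panelists: int) -> float:
    """Lawshe's content validity ratio for one rubric item:
    CVR = (n_essential - N/2) / (N/2)."""
    half = n_panelists / 2
    return (n_essential - half) / half

def percent_agreement(rater_a, rater_b) -> float:
    """Simple inter-rater agreement: share of candidates scored identically by two raters."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b) -> float:
    """Cohen's kappa: observed agreement corrected for the agreement expected by chance,
    estimated from each rater's marginal distribution of scores."""
    n = len(rater_a)
    p_observed = percent_agreement(rater_a, rater_b)
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum((count_a[c] / n) * (count_b[c] / n) for c in set(rater_a) | set(rater_b))
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical example: 8 of 10 P-20 panelists mark a rubric item "essential".
print(f"CVR = {lawshe_cvr(8, 10):.2f}")  # 0.60

# Hypothetical example: a Cooperating Teacher and a University Supervisor score
# ten student teachers on the same 1-5 dispositions item.
cooperating_teacher = [3, 4, 5, 4, 3, 5, 4, 4, 3, 5]
university_supervisor = [3, 4, 4, 4, 3, 5, 4, 5, 3, 5]
print(f"Percent agreement = {percent_agreement(cooperating_teacher, university_supervisor):.2f}")
print(f"Cohen's kappa = {cohens_kappa(cooperating_teacher, university_supervisor):.2f}")

Under Lawshe's method, items with low or negative CVR values would be flagged for revision or removal; a kappa value well above zero indicates agreement beyond what chance alone would produce.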
Alignment of Key Assessments
The Conceptual Framework informs all COE program student learning outcomes. Although Montana does not require programs to submit individual reports to Specialized Professional Associations, COE programs also base their programmatic requirements on SPA Standards, such as those from the International Reading Association and the Council for Exceptional Children. The Montana State Professional Education Preparation Program Standards (PEPPS) also inform curricular development. All program outcomes are in keeping with the MSUB and College of Education Mission statements.
[Table: Alignment Between Unit Assessment System Professional Standards and the COE Conceptual Framework, mapping the 2013 Revised InTASC Standards, grouped by category (e.g., Professional Responsibility), to the conceptual framework domains.]
Every key assessment instrument utilized by the COE was designed to meet the InTASC Standards, providing cohesion among programs. As of this Fall 2016 update to the Unit Assessment System, the focus is on initial preparation programs. Discussions around improving the Advanced Program portion of the Unit Assessment System will take place during the Fall 2017 semester. Advanced Programs remain under the previous version of the Assessment System until that time.
Each program has specific instruments used during field experiences and clinical practice that are directly relevant to Candidate and P-12 Student Learning as described in CAEP Standards 1, 2, 3, and 4. College-wide key assessments exist for measuring pre-candidate and candidate dispositions. (Note that assessment of Program Impact is a linked, yet separate set of goals and assessments, described by the Program Impact Phase-In Plan developed in cooperation with the Montana Council of Deans Ad Hoc Standard 4 Committee.)
Our key assessments have been informed by the now-obsolete NCATE Standards as well. Current CAEP requirements state that data should be disaggregated by program and by InTASC Category. Also, CAEP requires more evidence of breadth and depth of candidate progress than previously provided under NCATE. To accommodate our self-study of these changes during the 2015-16 AY, the newly hired Assessment Coordinator asked professional core course faculty to prepare Course Alignment Charts for professional core courses. These Charts were then aggregated into a matrix to show the current extent of key assessment evidence for the CAEP Standards. During the Fall 2016 COE Retreat, the Professional Core Matrix for Initial Preparation Programs was reviewed to help guide continuous improvement of the Unit Assessment System by updating the choice and alignment of key assessments.
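As an illustration of the disaggregation CAEP requires, the short Python sketch below groups hypothetical key-assessment scores by program and by InTASC category and reports a mean for each group. The record layout, program names, and scores are invented for illustration; this is not a COE-prescribed procedure or report format.

from collections import defaultdict
from statistics import mean

# Hypothetical records: (program, InTASC category, key assessment, score)
records = [
    ("Elementary Education", "Instructional Practice", "Summative Student Teaching Evaluation", 3.4),
    ("Elementary Education", "Professional Responsibility", "Student Teaching Dispositions Evaluation", 3.8),
    ("Reading", "Instructional Practice", "Summative Student Teaching Evaluation", 3.6),
    ("Reading", "Content", "Assessment of Content Pedagogy", 3.1),
]

# Disaggregate: collect scores separately for each (program, InTASC category) pair.
groups = defaultdict(list)
for program, category, assessment, score in records:
    groups[(program, category)].append(score)

for (program, category), scores in sorted(groups.items()):
    print(f"{program} | {category}: mean = {mean(scores):.2f} (n = {len(scores)})")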
Prior to this self-study, alignment of key assessments with InTASC Standards was as depicted on the chart below. The first four key assessments listed appear in the Appendix of this document.
[Table: Alignment Between Currently Used Key Assessments and InTASC Standards, as of Spring 2016. Each row lists a key assessment (e.g., the Summative Evaluation) with an X marking each InTASC Standard it addresses.]
As a part of this self-study, alignment between the individual rubric items for the key assessments will be reviewed for its congruency with the InTASC Standards. This work falls under the purview of the CAEP Standard 1 Committee. Upon completion of that review, alignment may be modified. The chart on this page will be used to document any changes to the alignment on the previous page.
[Table: Alignment Between Currently Used Key Assessments and Current InTASC Standards, as of Fall 2016. Columns list the InTASC Standards (e.g., Leadership & Collaboration); this chart documents changes resulting from the rubric-item review.]
Upon review of the Course Alignment Charts, the Professional Core Matrix for Initial Preparation Programs, and the rubric alignment above, the alignment of any proposed new assessments will be documented in the chart below. These assessments will be refined and piloted after the self-study review, beginning in academic year 2017-18.
[Table: Alignment of proposed new key assessments with the InTASC Standards (e.g., Leadership & Collaboration) and non-academic expectations; to be completed after the self-study review.]
Non-academic Assessment
CAEP Standard 3.3 requires that the COE establish and monitor "attributes and dispositions beyond academic ability that candidates must demonstrate at admissions and during the program." The non-academic assessment of MSUB candidates currently focuses on their ability to work with students, families, and colleagues in ways that reflect the dispositions expected of highly qualified developing professionals. These dispositions are listed below.
Personal Professionalism: The candidate exhibits the behavior of an educational professional
· Professional dress and hygiene
· Responds to feedback appropriately
Professional Dispositions: The candidate demonstrates a belief that all students can learn and exhibits fairness in his/her actions
· Demonstrates an acceptance of differences in others
· Shows a respect of different perspectives
· Demonstrates a belief that all students can learn
· Uses fair procedures in dealing with others
· Reflects on and analyzes practice
· Adjusts lessons to deal with diverse learners
As evidenced in the expected dispositions, the COE has included the ideal of fairness and the belief that all students can learn. Other dispositions expected of candidates are tied back to the COE Mission. These dispositions are systematically assessed across the entire spectrum of the candidate’s development through observable behavior in varied educational settings, beginning in the Sophomore Field Experience, continuing into Junior Field Experience, and culminating in Student Teaching. All candidates are expected to develop plans for improvement related to their personal growth in their professional dispositions. For example, in Junior Field Experience, Candidates are required to reflect on their teaching lessons and make specific plans for improvement in the actual teaching episodes.
Monitoring the non-academic assessment tool, analyzing the resulting data, and making recommendations based on the data are the purview of the CAEP Standard 3 Committee; that is, the CAEP Standard 1 Committee effectively subcontracts this work to the CAEP Standard 3 Committee as part of the Standard 3 Committee's monitoring of the Quality, Selectivity, and Recruitment Phase-In Plan.
Assessment Plan, 2016-17
Note that this academic year’s plan is unique due to it being a self-study year in which the COE will have a site visit from both CAEP and Montana State Office of Public Instruction. CAEP Standard Committees were given a hiatus in after the Fall 2016 retreat in order to allow faculty more time to work in program groups and in order to allow the assessment coordinator and data specialist to finalize and submit the self-study report. CAEP Standard Committees will meet in January and May, 2017, in order to review their goals and finalize plans for the next academic year.
After this self-study year, annual Fall Retreats will focus on program-level review of data found in the Quality Assurance System Annual Report for the purpose of continuous program improvement.
Fall Retreat 2016: Focus on CAEP Accreditation Documentation
CAEP Standard 1 Committee
· Reviews Unit Assessment System
· Identifies methodology for review of instruments using CAEP’s rubric.
· Identifies gaps and/or redundancies in Professional Core Matrix
· Addresses gaps and/or redundancies in Professional Core Matrix
· Makes recommendations for additional (new) Key Assessments
1. Breadth and Depth of Curriculum (e.g., Lesson Planning)
2. College and Career Readiness
3. Ethics and School Law
4. Technology Standards
6. Course Grade Distribution Comparisons
CAEP Standard 2 Committee
· Reviews and updates Clinical Practice & Partnerships Selected Imp. Plan
· On what additional data should this plan be based?
· What is missing from the plan? (Especially as compared to Standard 2)
· Does the plan meet the requirements CAEP has set for plans?
· Begins discussing goals for the next academic year, identifying a way to measure what they seek to accomplish this year (based on their Plan).
CAEP Standard 3 Committee
· Documents past efforts in recruiting:
1. Representatively diverse Candidates (Aaniiih Nakoda grant)
2. Candidates in STEM areas (Noyce Scholars)
· Writes into Plan methodology for recruiting:
1. Representatively diverse Candidates
· Addresses gaps and/or redundancies in Professional Core Matrix
· Fall Semester 2016: Submit Self-Study Report while continuing planning and assessment.
· October: The first edition of the Quality Assurance System Annual Report is prepared by the Data Specialist and Assessment Coordinator.
· By December: Faculty teaching Professional Core Courses administer Key Assessments for student learning outcomes as noted on the Professional Core Matrix; data are gathered on hard copy forms and returned to the Data Specialist at the end of the semester. The Data Specialist works with the Assessment Coordinator to collect, manage, and analyze the data. (Data are analyzed and documented in the Quality Assurance System Annual Report throughout the year for presentation at the Fall 2017 Assessment Retreat.)
· The Data Specialist works with IT, the Licensure Specialist, and the Assessment Coordinator to gather data not embedded within singular courses: GPA calculations, course grade distribution comparisons, enrollment, demographics, retention, Praxis II, Montana State Three-Part Assessment scores, and metrics from the COE’s Quality Assurance System Plans. Data are analyzed and documented in the Quality Assurance System Annual Report throughout the year for presentation at the Fall 2018 Assessment Retreat.
· The Assessment Coordinator works with each CAEP Committee Chair to finalize narratives and Plans for inclusion in the Self-Study Report.
Spring Semester 2017: Site Visit is held while Committees complete first year of Plan implementation.
· Each CAEP Committee is responsible for progress on its role in the Unit Assessment System; the CAEP Standard 3 Committee will present a report of its assessment of non-academic attributes and dispositions to the CAEP Standard 1 Committee for review, in keeping with the Quality, Selectivity, and Recruitment Phase-In Plan. The CAEP Standard 2 Committee will present a report of its progress on the Clinical Practice and Partnerships Selected Improvement Plan. The COE Dean will present a report of the Montana Council of Deans' progress toward the state-wide Program Impact Plan.
· Off-site CAEP Review will occur, with the Assessment Coordinator and COE Dean working with each CAEP Committee to provide additional artifacts as requested by the visiting team.
· On-site CAEP Review will occur.
Fall Retreat 2017: Faculty reflect on data reported from a comprehensive Quality Assurance System.
· Faculty will be presented with the first Quality Assurance System Annual Report. Data will be reported by program and InTASC Standard and evaluated for use in continuous improvement at both the College- and program-level.
· Each Committee will report on the prior year’s progress toward the Plans.
· Each Committee will articulate and prioritize goals for AY2017-18, based on the data resulting from the assessment of their Plans.
Data Collection and Management
Until fall 2008, the data collected were stored and analyzed in Excel. At that time, the Candidate Database was created: an Access database designed by the COE Dean's Administrative Associate in conjunction with departmental Assessment Coordinators. The Access database did not prove efficient because it allowed data entry by only a few individuals. Working with IT, the COE transferred the database to SQL so that mentors and supervisors could enter evaluations of the students with whom they were working. This proved to have limited sustainability as well, because IT did not have the personnel resources to devote to the continual evolution of data requirements and assessment system changes.
During Spring 2014, the COE purchased Tk20. The first phase of development occurred during the summer, and the system was implemented in Fall 2014 while still under development. Supervisors, mentor teachers, and counselors could enter data on site, and the Assessment Coordinator, with the Tk20 Unit Coordinator, was ostensibly able to generate reports for unit faculty to analyze and use for program improvement. However, Clinical Educators and Candidates experienced a number of issues using Tk20, and the reports generated by the system proved neither meaningful nor efficient.
Turnover in COE personnel and lack of customer service from Tk20 were coupled with the initial problems in Tk20 implementation, creating a system that was barely usable. In Fall 2015, the COE Dean made an executive decision to abandon Tk20 and move back to hard copy assessment forms. The decision was largely a relief to Clinical Educators and Candidates. Finding a new system is now a goal represented in the Clinical Practice and Partnerships Selected Improvement Plan; its appearance in that plan is a response to CAEP’s requirement for Clinical Educators to be involved in all aspects of co-creating the clinical experience. Clinical experiences can be stressful, and the COE is now resolved to find a technological solution to data collection, management, and analysis that does not cause additional stress on Clinical Educators and Candidates. P-12 Partners will be involved in the choice of any new technological solutions.
Appendix
Assessment of Content Pedagogy
Evidence of Professional Growth (Teacher Work Sample)
Praxis II, GPA, and Course Comparison Protocol
Dispositions Observation
· Plan of Improvement [POI] Date POI Initiated: Date POI Completed:
Please use the following 1 to 5 rating scale to rate the candidate in each of the following areas. A rating of 1 on any indicator requires a written plan of improvement.
Professional Dispositions: The candidate demonstrates a belief that all students can learn, exhibits fairness in his/her actions, and demonstrates the behavior of a professional educator.
Attendance
Sometimes misses scheduled days, but provides advance notice
Always attends according to assigned schedule and has not missed any scheduled days
InTASC 10 (j)
Timeliness
Is frequently late
Is on time in most situations, and provides notification if occasionally late.
Is always on time or appropriately early
InTASC 10 (j)
Professional Dress and Hygiene
Dresses professionally and has appropriate hygiene in all situations
InTASC 10 (j)
Task Completion
Completes tasks responsibly and in a timely manner
Is proactive in completing tasks
InTASC 10 (j, k)
Response to Feedback
Frequently seeks feedback and responds by reflecting upon and integrating feedback into future acts.
InTASC 9 (i)
Acceptance of Differences in Others
Fails to accept the differences of others
Accepts most differences in others
Consistently accepts and is respectful of differences in others
InTASC 2 (b)
Respect for Others' Perspectives
Regularly demonstrates consideration of other perspectives
Consistently acknowledges and respects different perspectives
InTASC 3 (c)
Belief that All Students Can Learn
Fails to demonstrate the belief that all students can learn
Regularly demonstrates the belief that all students can learn
Consistently includes all students in learning
InTASC 1 (a)
Fairness in the Treatment of Individuals
Fails to use fair procedures in dealing with individuals
Regularly demonstrates fair and equitable treatment of individuals
Always treats individuals fairly and equitably
InTASC 3 (c)
Reflection and Analysis of Teaching Practices
Fails to reflect on teaching practices.
Reflects on and analyzes teaching with regard to the whole group
Reflects on and analyzes teaching with regard to each student in the classroom
InTASC 6 (f)
Lesson Planning for Diverse Learners
Fails to plan for the diversity of learners in a classroom
Plans lessons to meet the needs of the diversity of learners in a classroom
Plans and implements lessons to meet the needs of the diversity of learners in a classroom
InTASC 7 (g)
(EDU 220)
Performance Evaluations
Site Supervisor
Please complete these forms which are required components of the Sophomore Field Experiences and
Candidate’s Name
Candidate’s ID#
Site
Evaluator:
Date
Please use the following 1 to 5 rating scale to rate the candidate in each of the following areas. A rating of 1 on any indicator requires a written plan of improvement.
Professional Dispositions: The candidate demonstrates a belief that all students can learn, exhibits fairness in his/her actions, and demonstrates the behavior of a professional educator.
Attendance
Sometimes misses scheduled days, but provides advance notice
Always attends according to assigned schedule and has not missed any scheduled days
Timeliness
Is frequently late
Is on time in most situations, and provides notification if occasionally late.
Is always on time or appropriately early
Professional Dress and Hygiene
Dresses professionally and has appropriate hygiene in all situations
Task Completion
Completes tasks responsibly and in a timely manner
Is proactive in completing tasks
Response to Feedback
Responds inappropriately to feedback
Frequently seeks feedback and responds by reflecting upon and integrating feedback into future acts.
Acceptance of Differences in Others
Fails to accept the differences of others
Accepts most differences in others
Consistently accepts and is respectful of differences in others
Respect for Others' Perspectives
Fails to consider other perspectives
Regularly demonstrates consideration of other perspectives
Consistently acknowledges and respects different perspectives
Belief that All Students Can Learn
Fails to demonstrate the belief that all students can learn
Regularly demonstrates the belief that all students can learn
Consistently includes all students in learning
Fairness in the Treatment of Individuals
Fails to use fair procedures in dealing with individuals
Regularly demonstrates fair and equitable treatment of individuals
Always treats individuals fairly and equitably
Reflection and Analysis of Teaching Practices
Fails to reflect on teaching practices.
Reflects on and analyzes teaching with regard to the whole group
Reflects on and analyzes teaching with regard to each student in the classroom
Candidate should be aware of this disposition, but it is not evaluated at the sophomore level.
Lesson Planning for Diverse Learners
Fails to plan for the diversity of learners in a classroom
Plans lessons to meet the needs of the diversity of learners in a classroom
Plans and implements lessons to meet the needs of the diversity of learners in a classroom
Candidate should be aware of this disposition, but it is not evaluated at the sophomore level.
Professional Dispositions:
Midterm Comments:
Final Comments:
Pre-Student Teaching candidates are expected to demonstrate Developing Performance (2) on all Standards. Student Teaching candidates are expected to demonstrate Acceptable Performance (3) on all Standards.
Candidate:
Signature (supervisor/mentor/cooperating teacher): Date:
Signature (candidate): Date:
RATING
STANDARD #2
Learning Differences
The teacher uses understanding of individual differences & diverse cultures & communities, particularly in Montana’s Indian Education for All, to ensure inclusive learning environments that enable each learner to meet high standards.
STANDARD #3
Learning Environment
The teacher works with others to create environments that support individual & collaborative learning, & that encourage positive social interaction, active engagement in learning, & self-motivation.
STANDARD #4
Content Knowledge
The teacher understands the central concepts, tools of inquiry, & structures of the discipline(s) he or she teaches & creates learning experiences that make these aspects of the discipline accessible & meaningful for learners to assure mastery of the content.
STANDARD #5
Application of Content
The teacher understands how to connect concepts & use differing perspectives to engage learners in critical thinking, creativity, & collaborative problem solving related to authentic local & global issues.
STANDARD #6
Assessment
The teacher understands & uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, & to guide the teacher’s & learner’s decision making.
STANDARD #7
Planning for Instruction
The teacher plans instruction that supports every student in meeting rigorous learning goals by drawing upon knowledge of content areas, curriculum, cross-disciplinary skills, & pedagogy, as well as knowledge of learners & the community context.
STANDARD #8
Instructional Strategies
The teacher understands & uses a variety of instructional strategies to encourage learners to develop deep understanding of content areas & their connections, & to build skills to apply knowledge in meaningful ways.
STANDARD #9
Professional Learning and Ethical Practice
The teacher engages in ongoing professional learning & uses evidence to continually evaluate his/her practice, particularly the effects of his/her choices & actions on others (learners, families, other professionals, & the community), & adapts practice to meet the needs of each learner.
STANDARD #10
Leadership & Collaboration
Plan of Improvement
College of Education Licensure Standards & Clinical Practice
Site Supervisor – A rating of 1 on any indicator requires a written plan of improvement.
Please complete this form if the candidate is not meeting the dispositions or professional expectations of a sophomore field experience candidate.
Candidate’s Name
Candidate’s ID#
Print name and position
1. The evaluator is to identify specific areas for improvement which may include knowledge, skills, dispositions, professional expectations or any other area of concern applicable to the development of a beginning, professional educator.
2. Address each area; develop measurable goals and timelines (dates) of expected improvement(s).
3. Review the plan with the candidate, sign, and date it.
4. If applicable, attach copies of supporting documentation, anecdotal notes, lesson plans, etc.
5. A COPY OF THE PLAN OF IMPROVEMENT AND SUPPORTING DOCUMENTATION WILL BE PLACED ON FILE IN THE LICENSURE OFFICE.
Early Childhood Education P-3 or Elementary Education Assessment of Content Knowledge
Demonstrated During Student Teaching/ Clinical Practice
This evaluation is based on CAEP Standard 1, PEPPS 10.58.311, and InTASC Standards 4 and 5.
Candidates develop a deep understanding of the critical concepts and principles of their discipline and, by completion, are able to use discipline-specific practices flexibly to advance the learning of all P-12 students toward attainment of college- and career-readiness standards.
The Cooperating Teacher(s) completes this Content Validation Assessment on their Student Teacher Candidate. The College/University Supervisor must review this assessment of a beginning teacher, make any pertinent comments at the bottom, and sign. Please evaluate the candidate based on the Montana Board of Public Education's Administrative Rules of Montana Chapter 58 Professional Educator Preparation Program Standards (PEPPS) 10.58.531 Early Childhood Education P-3 and 10.58.532 Elementary Education, as applicable to the subjects being taught: Language Arts, Mathematics, Science, and Social Studies. Using the rubrics for each Indicator, record a score for each subject in the box provided.
3 = Advanced 2 = Proficient 1 = Basic 0 = Insufficient
Content Area _
3 Demonstrates advanced knowledge of content.
2 Demonstrates proficient content knowledge.
1 Uses basic content knowledge.
0 Uses inaccurate, insufficient content knowledge.
Language Arts   Mathematics   Science   Social Studies
Indicator B: Content alignment with identified objectives and standards.
3 Uses objectives and standards to make lessons meaningful to students.
2 Effectively uses objectives and standards to develop the lesson.
1 Attempts to use objectives and standards to develop the lesson.
0 Is unable to use objectives and standards to develop a lesson.
Language Arts   Mathematics   Science   Social Studies
Indicator C: Accurate and current sources of information.
3 Uses additional resources beyond manuals, texts, and curriculum guides.
2 Effectively uses manuals, texts, and curriculum guides.
1 Demonstrates minimal use of instructional resources.
0 Is ineffective in using available instructional resources.
Language Arts   Mathematics   Science   Social Studies
Indicator D: Content research to support lesson development.
3 Demonstrates in-depth research of topic content.
2 Demonstrates acceptable research of topic content.
1 Demonstrates minimal research of topic content.
0 Demonstrates little or no research of topic content.
Language Arts   Mathematics   Science   Social Studies
Candidate ID# Grade Level(s)
Candidate Signature (I have been made aware of this assessment) Print Name Date
Cooperating Teacher Signature (I have completed this assessment) Print Name Date
University/ College Supervisor Signature (I have reviewed this assessment) Print Name Date
University/College Supervisor’s Comments:
Demonstrated During Student Teaching/ Clinical Practice
This evaluation is based on InTASC Standard #4: The student teacher understands the central concepts, tools of inquiry, and structures of the discipline(s) he or she teaches and can create learning experiences that make these aspects of the discipline accessible and meaningful for learners to assure mastery of the content.
The Cooperating Teacher(s) completes this Content Validation Assessment on their Student Teacher Candidate. The College/University Supervisor must review this assessment of a beginning teacher, make any pertinent comments at the bottom, and sign. Please evaluate the candidate based on the Montana Board of Public Education’s definition of content as found in Administrative Rules of Montana Chapter 58 Professional Educator Preparation Program Standards as applicable to the subject(s) being taught. Using the rubrics for each Indicator, record a score in the box provided.
3 = Advanced 2 = Proficient 1 = Basic 0 = Insufficient
Content Area:
3 Demonstrates advanced knowledge of content.
2 Demonstrates proficient content knowledge.
1 Uses basic content knowledge.
0 Uses inaccurate, insufficient content knowledge.
Indicator B: Content alignment with identified objectives and standards.
3 Uses objectives and standards to make lessons meaningful to students.
2 Effectively uses objectives and standards to develop the lesson.
1 Attempts to use objectives and standards to develop the lesson.
0 Is unable to use objectives and standards to develop a lesson.
Indicator C: Accurate and current sources of information.
3 Uses additional resources beyond manuals, texts, and curriculum guides.
2 Effectively uses manuals, texts, and curriculum guides.
1 Demonstrates minimal use of instructional resources.
0 Is ineffective in using available instructional resources.
Indicator D: Content research to support lesson development.
3 Demonstrates in-depth research of topic content.
2 Demonstrates acceptable research of topic content.
1 Demonstrates minimal research of topic content.
0 Demonstrates little or no research of topic content.
Signatures
Candidate ID# Grade Level(s)
Candidate’s Signature (I have been made aware of this assessment) Print Name Date
Cooperating Teacher’s Signature (I have completed this assessment) Print Name Date
University/College Supervisor’s Signature (I have reviewed this assessment) Print Name Date
University/College Supervisor’s Comments:
MSU BILLINGS COLLEGE OF EDUCATION EVIDENCE OF PROFESSIONAL GROWTH (EPG) ASSESSMENT RUBRIC
Candidate:
EPG: First / Second / Third / Fourth
A lower rating of a 1 or a 2 on any performance indicator requires that an additional Evidence of Professional Growth (EPG) be completed and teaching performance observed.
Rating: 1  2  3  4
1. Written Work Category Weight = 11%
Quality of Professional Writing in Terms of Mechanics, Communication of Ideas, and Completeness (Standard 4, 9)
Written work lacks appropriate quality across these measures: Mechanics, Communication of Ideas, and Completeness
Written work illustrates appropriate quality across one of these three measures: Mechanics, Communication of Ideas, and Completeness
Written work illustrates appropriate quality across two of these three measures: Mechanics, Communication of Ideas, and Completeness
Written work illustrates appropriate quality in Mechanics, Communication of Ideas, and Completeness
Written work illustrates professional quality in Mechanics (free of mechanical errors), Communication of Ideas (ideas are logically, correctly, and clearly presented), and Completeness (data and ideas come together in reasonable conclusions)
11%
School and Community Description (Standard 2)
No description of the relevant characteristics of the school and community
Minimal description of the relevant characteristics of the school and community
Partial description of the relevant characteristics of the school and community
Adequate description of the relevant characteristics of the school and community
Comprehensive description of the relevant characteristics of the school and community
3%
3%
3%
Lesson Objectives Identified (Standard 5, 6, 8)
No lesson objectives provided
Provides lesson objectives, but the objectives are not clear and not measurable
Provides clear lesson objectives, but they are not aligned to curriculum/standards
Provides clear and measurable lesson objectives that are aligned to appropriate curriculum/standards
Provides a comprehensive set of clearly written, measurable lesson objectives that are directly aligned to appropriate curriculum/standards; objectives provide a framework around which the lesson is designed
5%
8)
No recognizable structure; procedures are not clear or fail to meet the needs of the learners
Recognizable structure; procedures are not clearly defined or the design does not meet the needs of the learners
Recognizable structure and procedures; designed to meet the needs of some of the learners
Recognizable structure and procedures; designed to meet the needs of most of the learners
Clearly defined structure and procedures; designed to meet the needs of all learners
10%
1    2    3    4
Instructional Materials and Resources (Standard 1, 2, 3)
Materials and resources are not suitable (age or developmentally appropriate) for students; do not support the lesson objectives or engage students in meaningful learning
Few materials and resources are suitable (age or developmentally appropriate) for students; few support the lesson objectives or are designed to engage students in meaningful learning
Some materials and resources are suitable (age or developmentally appropriate) for students; some support the lesson objectives and are designed to engage students in meaningful learning
Most materials and resources are suitable (age or developmentally appropriate) for students; most support the lesson objectives and are designed to engage students in meaningful learning
All materials and resources are suitable (age or developmentally appropriate) for students; all support the lesson objectives and are designed to engage students in meaningful learning; evidence of appropriate use of technology, if applicable
5%
Potential of Planned Activities to Engage Students in the Lesson Objective Concepts (Standard 1, 3, 5)
Planned activities are not aligned with identified lesson objectives
Planned activities are aligned with lesson objectives but have no potential to engage the students in concepts identified in lesson objectives
Planned activities are aligned with lesson objectives and have the potential to engage the students in lesson concepts at a few points during the lesson
Planned activities are aligned with lesson objectives and have the potential to engage the students in lesson concepts at many points during the lesson
Planned activities are aligned with lesson objectives and have the potential to engage the students in lesson concepts throughout the lesson
10%
4. Teaching the Lesson: Assessed by Formative Evaluation and Observation (not EPG)
1    2    3    4
Assessments Align with Lesson Objectives (Standard 5, 6, 8)
No assessments or assessments are not aligned with lesson objectives
Weak alignment between assessments and lesson objectives; not all lesson objectives are assessed
Some alignment between some assessments and lesson objectives; all lesson objectives are assessed
Adequate alignment between assessments and lesson objectives; all lesson objectives are assessed
Assessments and lesson objectives are fully aligned; all lesson objectives are assessed
5%
Validity of Assessments (Standard 5, 6, 8)
No assessments or assessments do not provide a useful measure of student understanding of lesson concepts
Assessments provide an inconsistent or weak measure of student understanding of lesson concepts
Assessments measure students' understanding of lesson concepts, but the assessments are limited in their ability to do so across a broad range of understandings
Assessments measure students' understanding of lesson concepts; assessments provide adequate and reasonable data across a broad range of understandings
Assessments are clear, concise, and free of ambiguity that would compromise validity; assessments provide rich student data across a broad range of understandings
10%
6, 8)
There is no evidence of pre- and post-assessment data
Pre- and post-assessment data are collected, but not for all students and all lesson objectives
Pre- and post-assessment data are collected on all students, but data are not collected on all lesson objectives
Pre- and post-assessment data are collected on all students and all lesson objectives
Pre- and post-assessment data are collected on individual students and subgroups for all lesson objectives; presents data in a well-organized manner
5%
Fails to analyze data
Analyzes data but only for some students
Analyzes data for all students but only on some of the lesson objectives
Analyzes data for all students and all lesson objectives
Analyzes data for individual students and subgroups on all lesson objectives; presents analysis in a well-organized and coherent manner; provides conclusions for patterns in the data
10%
1    2    3    4
Insights on Effective Instruction (Standard 1, 2, 3, 7, 9)
Provides no rationale to explain why lesson activities were successful or unsuccessful
Provides a weak rationale to explain why lesson activities were successful or unsuccessful; rationale is simplistic or underdeveloped
Provides some rationale using student performance to explain why lesson activities were successful or unsuccessful; explores some reasons for student progress or lack thereof
Provides adequate rationale using student performance to explain why lesson activities were successful or unsuccessful; provides examples and explores plausible reasons for student progress or lack thereof
Clearly communicates rationale using student performance to explain why lesson activities were successful or unsuccessful; analyzes the value of activities and determines effective instruction based on individual student and subgroup learning results
5%
7, 9)
Evaluates assessment tools and techniques; provides no suggestions for improving the assessment tools and techniques
Evaluates assessment tools and techniques in response to student performance; provides limited suggestions for improving the assessment tools and techniques
Analyzes assessment tools and techniques in response to student performance; provides adequate and useful suggestions for improving the assessment tools and techniques
Analyzes assessment tools and techniques in response to student performance; analyzes assessment effectiveness in yielding useful data for guiding instruction; provides valuable suggestions for improving assessment tools and techniques
5%
7, 9)
Provides no ideas or inappropriate ideas for changing lesson objectives, instruction, and assessment
Provides ideas for changing lesson objectives, instruction, and assessment; provides no rationale why these changes would improve student learning
Provides some ideas for changing lesson objectives, instruction, and assessment; provides some rationale why these changes would improve student learning
Provides adequate ideas for changing lesson objectives and instruction based on assessment results; provides adequate rationale why these changes would improve student learning
Provides specific ideas for changing lesson objectives and instruction to improve student performance based on individual and subgroup assessment results; provides credible rationale based on data why these changes would improve student learning
5%
Evidence of Impact on Student Learning (Standard 5, 6, 8)
Fails to interpret data for the impact of instruction on student learning
Includes incomplete or superficial interpretation of data for the impact of instruction on student learning
Interprets data in a technically accurate way; conclusions regarding the impact of instruction on student learning are not fully supported
Interprets data in a technically accurate way; conclusions regarding the impact of instruction on student learning are supported
Clearly interprets data in a technically accurate way focusing on individual differences; conclusions regarding the impact of instruction on student learning are fully supported by the data
5%
· INTASC Standard #1: Learner Development: The teacher understands how learners grow and develop, recognizing that patterns of learning and development vary individually within and across the cognitive, linguistic, social, emotional, and physical areas, and designs and implements developmentally appropriate and challenging learning experiences.
· INTASC Standard #2: Learning Differences: The teacher uses understanding of individual differences and diverse cultures and communities to ensure inclusive learning environments that enable each learner to meet high standards.
· INTASC Standard #3: Learning Environments: The teacher works with others to create environments that support individual and collaborative learning, and that encourage positive social interaction, active engagement in learning, and self-motivation.
· INTASC Standard #4: Content Knowledge: The teacher understands the central concepts, tools of inquiry, and structures of the discipline(s) he or she teaches and creates learning experiences that make these aspects of the discipline accessible and meaningful for learners to assure mastery of the content.
· INTASC Standard #5: Application of Content: The teacher understands how to connect concepts and use differing perspectives to engage learners in critical thinking, creativity, and collaborative problem solving related to authentic local and global issues.
· INTASC Standard #6: Assessment: The teacher understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher’s and learner’s decision making.
· INTASC Standard #7: Planning for Instruction: The teacher plans instruction that supports every student in meeting rigorous learning goals by drawing upon knowledge of content areas, curriculum, cross- disciplinary skills, and pedagogy, as well as knowledge of learners and the community context.
· INTASC Standard #8: Instructional Strategies: The teacher understands and uses a variety of instructional strategies to encourage learners to develop deep understanding of content areas and their connections, and to build skills to apply knowledge in meaningful ways.
· INTASC Standard #9: Professional Learning and Ethical Practice: The teacher engages in ongoing professional learning and uses evidence to continually evaluate his/her practice, particularly the effects of his/her choices and actions on others (learners, families, other professionals, and the community), and adapts practice to meet the needs of each learner.
MSUB COE Quality Assurance System Fall 2016.docx
College of Education
Fall 2016
Table of Contents
Praxis Subject Assessment Prior to Program Completion
Assessment of Content Pedagogy
State Three-Part Assessment
Evidence of Professional Growth
Common Core Position Paper
Course Grade Comparisons
Enrollment and Demographics
Introduction
The following Quality Assurance System Annual Report contains data tables for academic years 2013-2014, 2014-2015, and 2015-2016. These data were compiled and analyzed in Summer and Fall 2016. Some data are from nationally known assessments or measures of candidate knowledge (e.g. GPA, ACT, and Praxis Subject Assessments). The Assessment of Content Pedagogy and the State Three-Part Assessment are Montana’s state-mandated assessments. Several of the data charts represent data from assessments created by Montana State University Billings (e.g. Dispositions Evaluation, Summative Evaluation, Evidence of Professional Growth, and Common Core Position Paper). There are also some charts comparing course grades for College of Education students and non-College of Education students at Montana State University Billings as well as some charts with enrollment and demographics. Montana State University Billings is dedicated to protecting individual student data; thus, student data is not reported in cases where the sample size (N) is less than 5.
Data were analyzed in two ways:
1. Means and ranges. The COE used the customary method of reporting Ns, averages, and ranges. The black and white charts in this report show these results.
2. Heat maps. Most COE assessment rubrics are scored on a Likert scale, and testing for statistically significant differences between Likert-scale scores is statistically inappropriate. Therefore, the COE examined the data by comparing the distribution of candidate scores across the performance levels of each rubric. The colored heat maps in this report show the percentage of candidates performing at each level of the Likert scale, with red or orange indicating lower percentages and green indicating the highest percentage for a given rubric criterion.
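To illustrate the heat-map calculation, the following is a minimal sketch in Python; the level labels, the four-level scale, and the function and variable names are illustrative assumptions rather than the COE's actual data layout or tooling.

    # Minimal sketch of the heat-map summary described above: the share of
    # candidates scoring at each performance level on one rubric criterion.
    # Level labels are taken from the rubrics in this report; the data layout is hypothetical.
    from collections import Counter

    LEVELS = ["Unacceptable", "Basic", "Proficient", "Advanced"]

    def level_distribution(scores):
        """Return N and the percentage of candidates at each performance level."""
        counts = Counter(scores)
        n = len(scores)
        percents = {level: (100.0 * counts[level] / n if n else 0.0) for level in LEVELS}
        return n, percents

    # Example: one rubric criterion for one semester cohort of five candidates.
    n, dist = level_distribution(["Advanced", "Proficient", "Advanced", "Basic", "Advanced"])
    print(n, {level: f"{pct:.2f}%" for level, pct in dist.items()})

Each cell of a heat map is then one of these percentages, shaded from red or orange (low) to green (high).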
Cumulative GPA at Program Entry
CAEP Standards 1.1, 3.2, 3.4
Cumulative GPA at Admission to the MSUB College of Education for Initial Teacher Licensure Programs
Columns: Program | Timeframe | N | Program Average Cum GPA at Entry | College of Ed Overall Average Cum GPA at Entry for Same Timeframe | Range of that Program's GPA for Same Timeframe
ELEMENTARY EDUCATION
*Due to low N, this program's data is not reported.
**Due to low Ns, data for all programs besides Elementary Education has been aggregated to achieve N≥5
***Due to N<5, GPA for 2nd Majors in Secondary Education not reported, except for Broadfield Social Studies
GPA at Program Completion
CAEP Standards 1.1, 3.2, 3.4, 3.5
Cumulative GPA at Exit of MSUB College of Education for Initial Teacher Licensure Programs
Columns: Program | Timeframe** | N | Program Average Cumulative GPA at Exit | College of Ed Overall Average Cum GPA at Exit for Same Timeframe | Range of that Program's GPA for Same Timeframe
ELEMENTARY EDUCATION 
*Due to N < 5, this program's data is not reported.
**Due to low Ns, program data has been aggregated with the exception of Elementary Education K-8 and Special Education P-12
♦Due to N < 5, Cumulative GPA for 2nd Majors in Secondary Education Programs are not reported
Content GPA at Exit of MSUB College of Education for Initial Teacher Licensure Programs
Columns: Program | Timeframe | N | Program Average Content GPA at Exit | College of Ed Overall Average Content GPA at Exit for Same Timeframe | Range of that program's GPA for Same Timeframe
ELEMENTARY EDUCATION
*Due to N < 5, this program's data is not reported.
**Due to N < 5 for multiple semesters, this program's data has been disaggregated into two time periods
***Due to N < 5 for multiple semesters, this program's data has been aggregated
♦Due to N < 5, Content GPA for 2nd Majors in Secondary Education Programs are not reported
Praxis Subject Assessment Data
CAEP Standards 1.1, 2.3, 3.2, 3.3, 3.5, 5.4
Assessment of Content Pedagogy
CAEP Standards 1.1, 1.3, 2.3, 3.2, 3.5
The Assessment of Content Pedagogy (ACP) is a Montana state-mandated assessment. It was not created by the COE.
It is designed to be completed by the candidate's Cooperating Teacher during Student Teaching as an evaluation of the candidate's content knowledge. Elementary Education candidates are assessed in each content area (Language Arts, Mathematics, Science, and Social Studies) on each indicator (Knowledge of Content, Content Alignment with Identified Objectives and Standards, Accurate and Current Sources of Information, and Content Research to Support Lesson Development), giving them four scores per indicator and a total of 16 scores. Secondary Education, Special Education, and Reading candidates are assessed once on each indicator, for a total of four scores.
The following are four charts summarizing the ACP results: two charts for Elementary Education and two charts with the Secondary Education, Special Education and Reading results. The ACP is one of three parts in the State Three-Part Assessment and is required for licensure.
State Three-Part Assessment
CAEP Standards: 1.1, 1.2, 1.3, 1.4, 1.5, 2.3, 3.5, 3.6, 5.3, 5.4
The State of Montana requires all teacher candidates to pass the Montana Assessment of Content Knowledge with at least a score of 7 to be recommended for licensure by an accredited Montana postsecondary education preparation program.
The Montana Assessment of Content Knowledge is made up of three parts (hence the name "State Three-Part Assessment"): the candidate's content GPA, score on the Assessment of Content Pedagogy (ACP), and Praxis score.
The possible range for the Content Knowledge Score (CKS) is 0-11. Teacher candidates who score lower than CKS = 7, or who score zero on any of the three measures, are not recommended for licensure.
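The decision rule described above can be sketched as follows; this is an illustration only, since the State's tables for converting the content GPA, ACP score, and Praxis score into points on the 0-11 scale are not reproduced in this report, and the function and parameter names are hypothetical.

    # Sketch of the State Three-Part Assessment decision rule described above.
    # The three component point values are assumed to have already been computed
    # from the State of Montana's scoring tables, which are not shown here.
    def recommend_for_licensure(gpa_points: int, acp_points: int, praxis_points: int) -> bool:
        """Return True if the candidate meets the Content Knowledge Score requirement."""
        components = (gpa_points, acp_points, praxis_points)
        cks = sum(components)                      # Content Knowledge Score, possible range 0-11
        if any(part == 0 for part in components):  # a zero on any measure is disqualifying
            return False
        return cks >= 7                            # minimum passing CKS

    # Example: hypothetical component scores of 3, 3, and 2 give CKS = 8, which passes.
    print(recommend_for_licensure(3, 3, 2))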
ACT
CAEP Standard 3.2
The ACT is not required for MSUB admission or COE admission. However, many candidates submit ACT scores, and the COE tracks these scores as shown on the following pages.
Montana’s Board of Regents is currently moving forward with requiring all high school juniors to take the ACT, which will allow the COE to begin tracking the CAEP Standard 3.2 requirement that the candidate group average score be in the top 50% of national performance beginning in 2016-17.
Currently, the COE benchmarks available ACT scores against the MT average for high school graduates of any given assessment cycle. The group average of COE candidate scores exceeds the Montana ACT average for every year indicated.
Dispositions Evaluations
CAEP Standards 2.3, 3.3, 3.6, 4.2
The Dispositions Evaluation is a Montana State University Billings created key assessment of candidates' professional dispositions. Cooperating Teachers administer the evaluation. It is administered to sophomores in EDU 220, to juniors during Junior Field, and to seniors during Student Teaching. In EDU 220, starting Fall 2014 and continuing to the present, candidates have received two administrations of the Dispositions Evaluation. Jr. Field and Student Teaching candidates are usually evaluated once. A rating of "Need-for-Improvement" on any indicator triggers a plan of improvement and could result in additional administrations of the Dispositions Evaluation.
Summative Evaluation of Clinical Experiences
CAEP Standards 1.1, 1.2, 1.3, 1.4, 1.5, 2.3, 3.4, 4.2
The Summative Evaluation is a Montana State University Billings created key assessment. This assessment is given to candidates during Junior Field and Student Teaching. Candidates are evaluated by their University Supervisors and their Cooperating Teachers. This assessment is designed to assess candidate performance on the 10 InTASC standards. Prior to receiving a summative evaluation, candidates are assessed formatively using a qualitative instrument.
Evidence of Professional Growth
CAEP Standards: 1.1, 1.2, 1.3, 1.4, 1.5, 2.3, 3.4, 3.6
The Evidence of Professional Growth (EPG) is a Montana State University Billings created key assessment. It is a teacher work sample designed to show candidates' use of research and evidence to develop an understanding of the profession, enhance their practice, and apply knowledge and skills for the purpose of a positive impact on P-12 learning.
EPGs are evaluated by the university supervisors and MSUB clinical faculty. During Junior Field, candidates complete one EPG. In Student Teaching, all candidates complete at least two EPGs to show growth between their first and second attempts. In the Student Teaching reports below, the #1 and #2 indicators denote the first or second administration of the EPG. Candidates are ordinarily required to complete more than two EPGs if low scores persist on the second attempt; due to the low N of candidates completing more than two EPGs, those data are not publicly reported. The following data reports reflect improvements to the instrument: one version of the EPG rubric was used from Fall 2013 through Spring 2014, after which the EPG was revised to improve its validity and actionability, so different charts appear below for the different versions used during those time frames. Reports for Fall 2014-Spring 2016 reflect the current version of the EPG rubric. There is also a section for Elementary Education Jr. Field for Fall 2015-Spring 2016, as a slightly different version was used during that time frame.
Common Core Position Paper
CAEP Standards 1.2, 1.4, 3.4, 3.6
The Common Core Position Paper is a pilot assessment mainly intended to address candidate understanding of and commitment to the Common Core. It is administered by the instructor during EDU406, which focuses on the philosophy and legal and ethical issues of the teaching profession. The assignment has not yet been subjected to tests of reliability and validity. (See the COE Reliability and Validity Plan.) The chart below is the initial grade distribution for this paper, which will be used as a baseline for future comparisons.
Course Grade Comparisons
YYYY30=Spring (also includes winter intersession), YYYY50=Summer, YYYY70=Fall
CAEP Standards 3.2, 3.4, 3.5, 3.6
The Course Grade Comparisons compare average grades in courses between Education and Non-Education majors. The green highlighted rows are instances where Education majors had higher average grades for that particular course. In instances where N<5, the average and range are not reported. Fall 2016 is the first semester these data have been made available to the COE and is an initial benchmark.
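As a rough sketch of this comparison and of the N < 5 suppression rule, the following Python fragment uses hypothetical record fields (course, is_education_major, grade_points) and example values; it is not the COE's actual data extract or tooling.

    # Sketch: average grade points for Education vs. non-Education majors in each
    # course, suppressing any group with fewer than five students (N < 5).
    from collections import defaultdict
    from statistics import mean

    records = [
        {"course": "MATH 171", "is_education_major": True,  "grade_points": 3.3},
        {"course": "MATH 171", "is_education_major": False, "grade_points": 2.7},
        # ... one record per student per course (hypothetical data)
    ]

    def course_comparison(records, min_n=5):
        groups = defaultdict(list)
        for r in records:
            groups[(r["course"], r["is_education_major"])].append(r["grade_points"])
        summary = {}
        for (course, is_ed), points in groups.items():
            label = "Education" if is_ed else "Non-Education"
            if len(points) < min_n:
                summary[(course, label)] = "Not reported (N < 5)"
            else:
                summary[(course, label)] = {"N": len(points),
                                            "average": round(mean(points), 2),
                                            "range": (min(points), max(points))}
        return summary

    print(course_comparison(records))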
Enrollment and Demographics
CAEP Standard 3.1
Degree | Major | Fall 2010 | Fall 2011 | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Change | % Change
 |  | 2 | 3 | 9 | 3 | 8 | 7 | -1 | -12.50%
BS |  | 29 | 30 | 21 | 16 | 19 | 21 | 2 | 10.53%
READING
BS | Reading | 10 | 7 | 10 | 12 | 14 | 13 | -1 | -7.14%
 |  | 3 | 5 | 4 | 3 | 3 | 3 | 0 | 0.00%
Degree | Major | Fall 2010 | Fall 2011 | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Change | % Change
 |  | 1 | 1 | 1 | 2 | 3 | 5 | 2 | 66.67%
 |  | 22 | 59 | 66 | 56 | 60 | 52 | -8 | -13.33%
Semester Headcount, 2010 to 2016
Bottom Section indicates enrolled candidates
CAEP Standard 3.1
These data serve as the baseline for the COE Quality, Selectivity, and Recruitment Plan
Undergraduate Demographics, Fall 2015
Bottom Section indicates enrolled candidates
CAEP Standard 3.1
These data serve as the baseline for the COE Quality, Selectivity, and Recruitment Plan
Undergraduate Common Core Analysis Paper, F15 - X16
Grades | F15-800 (N=22) | F15-01 (N=14) | S16-01 (N=24) | S16-02 (N=10) | X16-01 (N=14)
A Range | 14 | 8 | 17 | 3 | 10
B Range | 4 | 6 | 5 | 6 | 2
C Range | 4 |  | 2 |  | 2
Below C |  |  |  | 1 | 
Assessment of Content Pedagogy Elementary Education
Rubric sections (one block of columns per section): Knowledge of Content (InTASC Std 4) | Content Alignment with Identified Objectives and Standards (InTASC Std 4) | Accurate and Current Sources of Information (InTASC Std 7) | Content Research to Support Lesson Development (InTASC Std 4)
Columns within each rubric section: N | Unacceptable | Basic | Proficient | Advanced
Four row groups follow, one per content area (Language Arts, Mathematics, Science, and Social Studies).

Fall 2013 | 39 | 0.00% | 5.13% | 35.90% | 58.97% | 39 | 0.00% | 0.00% | 25.64% | 74.36% | 39 | 0.00% | 0.00% | 33.33% | 66.67% | 39 | 0.00% | 2.56% | 43.59% | 53.85%
Spring 2014 | 42 | 0.00% | 4.76% | 54.76% | 40.48% | 42 | 0.00% | 0.00% | 35.71% | 64.29% | 42 | 0.00% | 2.38% | 28.57% | 69.05% | 42 | 0.00% | 0.00% | 50.00% | 50.00%
Fall 2014 | 36 | 0.00% | 2.78% | 50.00% | 47.22% | 36 | 0.00% | 5.56% | 27.78% | 66.67% | 36 | 0.00% | 2.78% | 16.67% | 80.56% | 36 | 0.00% | 2.78% | 44.44% | 52.78%
Spring 2015 | 33 | 0.00% | 9.09% | 33.33% | 57.58% | 34 | 0.00% | 5.88% | 17.65% | 76.47% | 34 | 0.00% | 5.88% | 38.24% | 55.88% | 34 | 0.00% | 2.94% | 47.06% | 50.00%
Fall 2015 | 33 | 0.00% | 0.00% | 33.33% | 66.67% | 33 | 0.00% | 3.03% | 15.15% | 81.82% | 33 | 0.00% | 3.03% | 18.18% | 78.79% | 33 | 0.00% | 3.03% | 36.36% | 60.61%
Spring 2016 | 39 | 0.00% | 0.00% | 48.72% | 51.28% | 39 | 0.00% | 0.00% | 20.51% | 79.49% | 39 | 0.00% | 2.56% | 15.38% | 82.05% | 39 | 0.00% | 0.00% | 41.03% | 58.97%

Fall 2013 | 38 | 0.00% | 2.63% | 57.89% | 39.47% | 38 | 0.00% | 0.00% | 26.32% | 73.68% | 38 | 0.00% | 0.00% | 28.95% | 71.05% | 38 | 0.00% | 2.63% | 47.37% | 50.00%
Spring 2014 | 40 | 0.00% | 10.00% | 32.50% | 57.50% | 40 | 0.00% | 2.50% | 30.00% | 67.50% | 40 | 0.00% | 0.00% | 22.50% | 77.50% | 40 | 0.00% | 2.50% | 42.50% | 55.00%
Fall 2014 | 36 | 0.00% | 8.33% | 30.56% | 61.11% | 36 | 0.00% | 0.00% | 22.22% | 77.78% | 36 | 0.00% | 2.78% | 25.00% | 72.22% | 36 | 0.00% | 2.78% | 41.67% | 55.56%
Spring 2015 | 33 | 0.00% | 3.03% | 42.42% | 54.55% | 33 | 0.00% | 6.06% | 21.21% | 72.73% | 31 | 0.00% | 0.00% | 25.81% | 74.19% | 33 | 0.00% | 3.03% | 48.48% | 48.48%
Fall 2015 | 33 | 0.00% | 0.00% | 48.48% | 51.52% | 33 | 0.00% | 3.03% | 27.27% | 69.70% | 33 | 0.00% | 3.03% | 21.21% | 75.76% | 33 | 0.00% | 3.03% | 39.39% | 57.58%
Spring 2016 | 39 | 0.00% | 2.56% | 46.15% | 51.28% | 39 | 0.00% | 0.00% | 28.21% | 71.79% | 39 | 0.00% | 0.00% | 38.46% | 61.54% | 39 | 0.00% | 0.00% | 43.59% | 56.41%

Fall 2013 | 36 | 0.00% | 0.00% | 75.00% | 25.00% | 36 | 0.00% | 0.00% | 47.22% | 52.78% | 35 | 0.00% | 0.00% | 25.71% | 74.29% | 35 | 0.00% | 2.86% | 45.71% | 51.43%
Spring 2014 | 37 | 0.00% | 5.41% | 48.65% | 45.95% | 37 | 0.00% | 0.00% | 54.05% | 45.95% | 37 | 0.00% | 0.00% | 37.84% | 62.16% | 37 | 0.00% | 0.00% | 54.05% | 45.95%
Fall 2014 | 31 | 0.00% | 0.00% | 45.16% | 54.84% | 31 | 0.00% | 3.23% | 25.81% | 70.97% | 30 | 0.00% | 3.33% | 30.00% | 66.67% | 30 | 0.00% | 3.33% | 43.33% | 53.33%
Spring 2015 | 31 | 0.00% | 6.45% | 45.16% | 48.39% | 31 | 0.00% | 3.23% | 32.26% | 64.52% | 31 | 0.00% | 3.23% | 35.48% | 61.29% | 31 | 0.00% | 3.23% | 45.16% | 51.61%
Fall 2015 | 33 | 0.00% | 0.00% | 42.42% | 57.58% | 33 | 0.00% | 3.03% | 24.24% | 72.73% | 32 | 0.00% | 3.13% | 21.88% | 75.00% | 32 | 0.00% | 3.13% | 40.63% | 56.25%
Spring 2016 | 34 | 0.00% | 0.00% | 47.06% | 52.94% | 33 | 0.00% | 0.00% | 42.42% | 57.58% | 33 | 0.00% | 0.00% | 42.42% | 57.58% | 34 | 0.00% | 0.00% | 44.12% | 55.88%

Fall 2013 | 33 | 0.00% | 3.03% | 66.67% | 30.30% | 33 | 0.00% | 3.03% | 36.36% | 60.61% | 32 | 0.00% | 0.00% | 28.13% | 71.88% | 33 | 3.03% | 3.03% | 45.45% | 48.48%
Spring 2014 | 37 | 0.00% | 2.70% | 51.35% | 45.95% | 38 | 0.00% | 0.00% | 55.26% | 44.74% | 38 | 0.00% | 0.00% | 50.00% | 50.00% | 37 | 0.00% | 0.00% | 59.46% | 40.54%
Fall 2014 | 32 | 0.00% | 0.00% | 50.00% | 50.00% | 31 | 0.00% | 0.00% | 38.71% | 61.29% | 30 | 0.00% | 3.33% | 23.33% | 73.33% | 28 | 0.00% | 3.57% | 39.29% | 57.14%
Spring 2015 | 31 | 0.00% | 9.68% | 32.26% | 58.06% | 31 | 0.00% | 3.23% | 25.81% | 70.97% | 31 | 0.00% | 3.23% | 38.71% | 58.06% | 31 | 0.00% | 3.23% | 48.39% | 48.39%
Fall 2015 | 32 | 0.00% | 0.00% | 43.75% | 56.25% | 32 | 0.00% | 3.13% | 25.00% | 71.88% | 32 | 0.00% | 3.13% | 15.63% | 81.25% | 32 | 0.00% | 3.13% | 37.50% | 59.38%
Spring 2016 | 33 | 0.00% | 0.00% | 42.42% | 57.58% | 30 | 0.00% | 3.33% | 33.33% | 63.33% | 32 | 0.00% | 0.00% | 31.25% | 68.75% | 33 | 0.00% | 0.00% | 45.45% | 54.55%
Assessment of Content Pedagogy Secondary, Reading, and Special Education
Rubric sections (one block of columns per section): Knowledge of Content (InTASC Std 4) | Content Alignment with Identified Objectives and Standards (InTASC Std 4) | Accurate and Current Sources of Information (InTASC Std 7) | Content Research to Support Lesson Development (InTASC Std 4)
Columns within each rubric section: N | Unacceptable | Basic | Proficient | Advanced
Timeframes: Fall 2013-Fall 2014; Spring 2015-Spring 2016; Fall 2013-Spring 2016

Biology 5-12 | 4 | **** | 4 | **** | 4 | **** | 4 | ****
Broadfield Science 5-12 | 6 | 0.00% | 0.00% | 50.00% | 50.00% | 6 | 0.00% | 0.00% | 33.33% | 66.67% | 6 | 0.00% | 0.00% | 16.67% | 83.33% | 6 | 0.00% | 0.00% | 16.67% | 83.33%
Chemistry 5-12 | 2 | **** | 2 | **** | 2 | **** | 2 | ****
Music K-12 | 11 | 0.00% | 9.09% | 36.36% | 54.55% | 11 | 9.09% | 0.00% | 9.09% | 81.82% | 11 | 0.00% | 9.09% | 9.09% | 81.82% | 11 | 9.09% | 0.00% | 27.27% | 63.64%
Physics 5-12 | 1 | **** | 1 | **** | 1 | **** | 1 | ****
Social Studies 5-12 | 2 | **** | 2 | **** | 2 | **** | 2 | ****
Spanish K-12 | 6 | 0.00% | 0.00% | 100.00% | 0.00% | 6 | 0.00% | 16.67% | 50.00% | 33.33% | 6 | 0.00% | 0.00% | 16.67% | 83.33% | 6 | 0.00% | 0.00% | 66.67% | 33.33%

Art K-12 | 7 | 0.00% | 0.00% | 42.86% | 57.14% | 7 | 0.00% | 0.00% | 28.57% | 71.43% | 7 | 0.00% | 0.00% | 42.86% | 57.14% | 7 | 0.00% | 0.00% | 28.57% | 71.43%
English 5-12 | 13 | 0.00% | 7.69% | 30.77% | 61.54% | 13 | 0.00% | 0.00% | 23.08% | 76.92% | 13 | 0.00% | 0.00% | 7.69% | 92.31% | 13 | 0.00% | 0.00% | 46.15% | 53.85%
Health & Human Performance K-12 | 10 | 0.00% | 0.00% | 30.00% | 70.00% | 10 | 0.00% | 0.00% | 40.00% | 60.00% | 10 | 0.00% | 0.00% | 20.00% | 80.00% | 10 | 0.00% | 0.00% | 50.00% | 50.00%
History 5-12 | 12 | 0.00% | 0.00% | 41.67% | 58.33% | 13 | 0.00% | 0.00% | 46.15% | 53.85% | 13 | 0.00% | 0.00% | 0.00% | 100.00% | 13 | 0.00% | 0.00% | 53.85% | 46.15%
Mathematics 5-12 | 8 | 0.00% | 12.50% | 12.50% | 75.00% | 8 | 0.00% | 0.00% | 50.00% | 50.00% | 8 | 0.00% | 0.00% | 37.50% | 62.50% | 8 | 0.00% | 0.00% | 62.50% | 37.50%
Reading K-12 | 9 | 0.00% | 0.00% | 33.33% | 66.67% | 9 | 0.00% | 0.00% | 11.11% | 88.89% | 9 | 0.00% | 0.00% | 33.33% | 66.67% | 9 | 0.00% | 0.00% | 33.33% | 66.67%

Art K-12 | 5 | 0.00% | 0.00% | 20.00% | 80.00% | 5 | 0.00% | 0.00% | 40.00% | 60.00% | 5 | 0.00% | 0.00% | 20.00% | 80.00% | 5 | 0.00% | 0.00% | 20.00% | 80.00%
English 5-12 | 13 | 0.00% | 0.00% | 7.69% | 92.31% | 13 | 0.00% | 0.00% | 15.38% | 84.62% | 13 | 0.00% | 0.00% | 0.00% | 100.00% | 13 | 0.00% | 0.00% | 7.69% | 92.31%
Health & Human Performance K-12 | 13 | 0.00% | 0.00% | 15.38% | 84.62% | 13 | 0.00% | 0.00% | 38.46% | 61.54% | 13 | 0.00% | 0.00% | 23.08% | 76.92% | 13 | 0.00% | 0.00% | 38.46% | 61.54%
History 5-12 | 16 | 0.00% | 0.00% | 25.00% | 75.00% | 16 | 0.00% | 6.25% | 25.00% | 68.75% | 16 | 0.00% | 0.00% | 31.25% | 68.75% | 16 | 0.00% | 0.00% | 31.25% | 68.75%
Mathematics 5-12 | 10 | 0.00% | 10.00% | 30.00% | 60.00% | 10 | 0.00% | 10.00% | 20.00% | 70.00% | 10 | 0.00% | 0.00% | 20.00% | 80.00% | 10 | 0.00% | 10.00% | 30.00% | 60.00%
Reading K-12 | 12 | 0.00% | 0.00% | 33.33% | 66.67% | 12 | 0.00% | 0.00% | 25.00% | 75.00% | 12 | 0.00% | 0.00% | 25.00% | 75.00% | 12 | 0.00% | 0.00% | 58.33% | 41.67%

Special Education P-12 (2013-2014) | 14 | 0.00% | 0.00% | 35.71% | 64.29% | 14 | 0.00% | 0.00% | 7.14% | 92.86% | 14 | 0.00% | 0.00% | 7.14% | 92.86% | 14 | 0.00% | 0.00% | 28.57% | 71.43%
Special Education P-12 (2014-2015) | 21 | 0.00% | 4.76% | 47.62% | 47.62% | 21 | 0.00% | 0.00% | 28.57% | 71.43% | 21 | 0.00% | 0.00% | 9.52% | 90.48% | 21 | 0.00% | 0.00% | 33.33% | 66.67%
Special Education P-12 (2015-2016) | 21 | 0.00% | 0.00% | 33.33% | 66.67% | 21 | 0.00% | 0.00% | 14.29% | 85.71% | 21 | 0.00% | 0.00% | 19.05% | 80.95% | 21 | 0.00% | 0.00% | 52.38% | 47.62%

*Due to N<5, program data is not reported.