Page 1: Periodic Program Review

Periodic Program Review

Guiding Programs in Today’s Assessment Climate

LaMont Rouse
Executive Director of Assessment, Accreditation & Compliance

Page 2: Periodic Program Review

Goals of the Presentation

■ Describe the periodic program review process;
■ Focus on PPR and Middle States;
■ Review core elements of the PPR process;
■ Discuss the new era of accountability in higher education;
■ Define what “good” assessment is;
■ Document the assessment of student learning;
■ Close the loop: use data to inform decisions and resource allocation;
■ Q&A

Page 3: Periodic Program Review

Purpose of Program Review

Strengthen and Validate Programs

■ What are the intended learning outcomes?

■ Do students know the intended learning outcomes of the program? Does the public?

■ Are students achieving these outcomes?

■ How do we prove it?

■ How can we strengthen and support programs?

Page 4: Periodic Program Review

Purpose of Program Review

Align Resources to Data and Outcomes

■ What does the data tell us?

■ How can we improve student learning?

■ How can we improve teaching effectiveness?

■ How do we take the program to the next level?

■ How does the program support Cedar Crest’s mission?

Page 5: Periodic Program Review

Purpose of Program Review
Middle States Commission on Higher Education

■ Demonstrates a model for institutional effectiveness;

■ MSCHE considers program review part of the new era of accountability;

■ Colleges must begin to report regularly to their various constituents (students, alumni, employees, employers, the federal government, the state, etc.);

■ The demands for accountability are growing stronger, not weaker.

Page 6: Periodic Program Review

New Era of Accountability
The Big Picture

• PPR will strengthen programs;

• PPR will lead to sustained enrollment growth at the college;

• PPR will help align the College with the best practices in higher education;

• PPR will allow Cedar Crest to tell its story in a more cohesive fashion.

Page 7: Periodic Program Review

Key Features of PPR

Helps Programs Continuously Update Their:

• Mission statements

• Student learning outcomes

• Curriculum maps

Page 8: Periodic Program Review

Key Features of PPR

Provides the following:

• Data on student learning outcomes;

• A quantitative record of accomplishments;

• Data that ultimately suggest the next steps in the maturation of the program.

Page 9: Periodic Program Review

Key Features of PPR
Provides the following:

• Strengthens programs by seeking alignment with the best practices in the field.

• Helps align resources to stated needs.

• Helps faculty and administrators see trends in a more coherent fashion.

Page 10: Periodic Program Review

The Role of Chairs

• Assigns duties as needed to complete the program review report.

• Works with the Office of Assessment and the Assessment Committee to ensure the PPR schedule is maintained.

• Notifies the Office of Assessment when issues arise.

Page 11: Periodic Program Review

The Role of Chairs
Best Practices

• Review with faculty the program’s mission statement and student learning outcomes before the full review begins.

• Read through prior years’ assessment reports. Ensure that all SLOs have been assessed within the last 3 years.

• Create a central location to collect/gather data.

Page 12: Periodic Program Review

The Role of Chairs
Best Practices

• Build trust

• Open communication

• Stay on schedule

Page 13: Periodic Program Review

The Role of Chairs/Time Line

Phase 1 (September - October)

• Go to the program review orientation sponsored by the Office of Assessment.

• Gather assessment reports from the last 3 to 5 years (as appropriate).

• Assign appropriate roles for the completion of the task.

Page 14: Periodic Program Review

The Role of Chairs/Time Line

Phase 2 (November - January)

• Start placing documents into the appropriate section of the binder.

• Contact Institutional Research & the Office of Assessment for any additional data.

• Complete sections 1, 2, 4, 8 & 10.

Page 15: Periodic Program Review

The Role of Chairs/Time Line

Phase 3 (February - April)

• Provide an update to the Office of Assessment and the Assessment Committee.

• By the first week of February, complete all data-gathering activities.

• Write and complete the remaining sections of the report.

Page 16: Periodic Program Review

The Role of Chairs/Time Line

Phase 4 (May - July)

• Complete the report and submit it to the Office of Assessment and the Assessment Committee.

• Make suggestions that will support your program through the next cycle.

• Make suggestions (if any) for curricular changes and modifications.

• Meet with the Assessment Committee to review the report.

Page 17: Periodic Program Review

Questions

Page 18: Periodic Program Review

What is “Good” Assessment?

Start with clear statements of the most important things you want students to learn from the course, program, or curriculum.

Teach what you are assessing. Help students learn the skills needed to do the assessment task. For example, if you are giving a writing assignment, help your students understand how you define good writing and give them feedback on drafts.

Because each assessment technique is imperfect and has inherent strengths and weaknesses, collect more than one kind of evidence of what students have learned.

Page 19: Periodic Program Review

What is “Good” Assessment?

Make assignments and test questions crystal clear. Write them so that all students will interpret them in the same way and know exactly what they are expected to do.

Before creating an assignment, write a rubric: a list of the key things you want students to learn by completing the assignment and to demonstrate on the completed assignment.

Likewise, before writing test questions, create a test “blueprint”: a list of the key learning goals to be assessed by the test and the number of points or questions to be devoted to each learning goal. For instance, a 100-point exam blueprint might allot 40 points to one learning goal, 35 to a second, and 25 to a third.

Page 20: Periodic Program Review

What is “Good” Assessment?

Collect enough evidence to get a representative sample of what your students have learned and can do. Collect a sufficiently large sample that you will be able to use the results with confidence to make decisions about your course, program, or curriculum.

Score student work fairly and consistently. Before scoring begins, have a clear understanding of the characteristics of meritorious, satisfactory, and inadequate papers. Then use a rubric to help score assignments, papers, projects, etc., consistently.

Use assessment results appropriately. Never base any important decision on only one assessment. (Failure to adhere to this maxim is one of the major shortcomings of many high-stakes testing programs.) Assessments shouldn’t make decisions for us or dictate what we should teach; they should only advise us as we use our professional judgment to make suitable decisions.

Page 21: Periodic Program Review

What is “Good” Assessment?

Features

• Each program should have 4 to 6 primary student learning outcomes.

• Each program should describe at least one direct and one indirect method for capturing data for each SLO.

• Every program should assess at least one or two of its SLOs each year.

Page 22: Periodic Program Review

Documenting Assessment

• Data should be gathered from rubrics, in-house tests, external instruments such as surveys, etc.

• A plan for collecting this data must be established along with roles and responsibilities.

• Data should be shared across the department. Transparency is key.

Page 23: Periodic Program Review

Documenting Assessment

• Use templates as much as possible when reporting data.

• Gather a little bit of data each year.

• Strategically map your assessment plan between PPRs.

• Find a balance between gathering “good enough” evidence and maintaining a sustainable system.

Page 24: Periodic Program Review

Closing the Loop/Meaningful Modifications

Closing the Loop: A step in an institutional effectiveness or assessment cycle. Many people think of closing the loop as a final step; it is, however, a step that is used to begin the cycle again, based on an analysis of what has been accomplished and learned up to that point. At Cedar Crest, assessment measures are used to determine program strengths and challenges. Analysis follows and leads to some decision about improving the program or continuing the program without change. It is that decision that serves to close the loop.

Page 25: Periodic Program Review

Closing the Loop/Meaningful Modifications

• What do the findings tell us now?

• Did our actions improve learning?

• What else do the findings show?

• What’s the next step?

• What have we learned about our assessment process?

• What can be improved?

Page 26: Periodic Program Review

Closing the Loop/Meaningful Modifications

• Must document closing-the-loop actions.

• Must allocate resources, in part, based on these closing-the-loop actions.

• Must demonstrate that we are a “learning” and adaptive organization.

Page 27: Periodic Program Review

Thank you.

