
Assessment & Review of Graduate Programs

Duane K. Larick, NC State University

David L. Wilson, Southern Illinois University Carbondale

Council of Graduate Schools Pre-Meeting Workshop

December 8, 2004

Guidelines for This Presentation

Please turn off or silence your cell phones
Please feel free to raise questions at any time during the presentation
We have included a set of discussion questions along the way
We will also leave time at the end for general discussion. We are very interested in your participation.

Agenda

Introduction and Objectives

Overview of Graduate Program Review
Reasons for Graduate Assessment
General Process of Program Review

Process or Processes for Development of a Program Review Procedure
External program review
Outcome-based – continuous & ongoing review
Comparative Data Sources

Case Studies
Southern Illinois University Carbondale
North Carolina State University

Summary and Discussion

Objectives

Discuss various motivators for undertaking graduate assessment

Increase overall awareness of recent trends in Graduate Program Review

Demonstrate practical experience/knowledge gained related to development and implementation of external reviews and outcome-based continuous and ongoing procedures for Graduate Program Review

Illustrate examples of data and managerial tools developed/utilized to improve the efficiency of the process

Why Assess Graduate Programs (external drivers)?

Improvement in the quality of graduate education

To help satisfy calls for accountability, especially at the state level

Requirement for regional accreditation, licensure, etc.

Why Assess Graduate Programs (internal drivers)?

Meet short-term (tactical) objectives or targets

Meet long-term (strategic) institutional/departmental goals
Funding allocation/reallocation

Funded project evaluation (GAANN, IGERT)

Understand sources of retention/attrition among students and faculty

Accreditation Agencies

Southern Association of Colleges and Schools
Western Association of Colleges and Schools
Northwest Association of Colleges and Schools
North Central Association
New England Association of Schools and Colleges
Middle States Commission on Higher Education

SACS Principles of Accreditation

Core requirement #5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”

SACS Criterion for Accreditation

Section 3 – Comprehensive Standards, #16: “The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”

SACS Principles of Accreditation

Section 3 – Comprehensive Standards: Standards for All Educational Programs

“12. The institution places primary responsibility for the content, quality, and effectiveness of its curriculum with the faculty.”

“18. The institution ensures that its graduate instruction and resources foster independent learning, enabling the graduate to contribute to a profession or field of study.”

Northwest Association of Colleges and Schools

Standard 2.B: The institution identifies and publishes the expected learning outcomes for each of its degree programs

The institution’s processes for assessing its educational programs are clearly defined, encompass all of its offerings, are conducted on a regular basis, and are integrated into the overall planning & evaluation plan

The institution provides evidence that its assessment activities lead to the improvement of teaching and learning

Intent of Accreditation Agency Effort

The intent of the accrediting agencies is to “encourage” institutions to create an environment of planned change for improving the educational process.

State Mandated Reviews and Assessment

Illinois Board of Higher Education’s Priorities, Quality, and Productivity (P.Q.P.) Initiative (1992)

IBHE’s Framework for Reviewing Priorities, Productivity, and Accountability (PPA) in Illinois Higher Education (December 2003)

So, the Questions We Need to Ask Ourselves Are:

What are we currently doing?
Why are we currently doing it?
Is what we are currently doing accomplishing the external goals just described?
Is what we are currently doing accomplishing the internal goals described above?
Is there a “better” way? Who defines better?

General Procedure(s) for Review of Graduate Programs

External program review conducted on a 5–10 year cycle
Standard practice at most institutions

Outcome-based continuous and ongoing program review
Being implemented by many in response to regional and state accreditation requirements and institutional needs

Operational Procedures: 8–10 Year Review Cycle

Components:
Internal self-study
External “team” review
Review team’s report
Program’s response
Administrative meeting

General Process for External Reviews

Administration: Typically administered by the Dean of the Graduate School or centrally through the Provost’s Office

Initiated by program or the administering office

Often conducted at the Department level; includes multiple degrees/programs

General Process for External Reviews

Typical Objectives for External Reviews

Reviews are conducted to gain a clearer understanding of a program’s:
Purpose(s) within the Institution
Effectiveness in achieving purposes
Overall quality
Future objectives
Changes needed to achieve objectives

General Process for External Reviews

Information Made Available at the Institution Level (examples include):
Enrollment: numbers, demographics
Applications: numbers applied/admitted/enrolled; quality indicators
Number of degrees awarded, time to degree
Financial support
Exit interviews

General Process for External Reviews

Self-Study: Purpose

Engage “stakeholders” in a thoughtful and creative study and evaluation of the program’s academic performance in relation to the Institution’s mission

Philosophy: The review must cover all components of the program’s mission: teaching, research, and outreach

General Process for External Reviews

Key Self-Study Components:
Program description – including objectives
Faculty – distribution & quality
Students – need, enrollment, quality, degrees granted, support
Curriculum/Instruction
Master’s & doctoral degrees granted

General Process for External Reviews

Key Self-Study Components:
Teaching, research, and service participation
Current research – national comparison, external support, interdisciplinary projects
Methods for internal program review
Recent changes & why
Strengths, weaknesses & opportunities

General Process for External Reviews

Review Team Make-up:
On-campus representation – often a Graduate School and/or Graduate Faculty representative
One or more off-campus external experts – depends on scope of program(s) being reviewed; can add to expense

General Process for External Reviews

Review Team Visit:
Often 2–4 days in length
Generally meets with University and College administration in addition to faculty and students

General Process for External Reviews

Review Team Report:
Generally includes some form of analysis of the strengths, weaknesses, opportunities, and needs of the graduate program from the perspective of its peers

General Process for External Reviews

Final Administrative Meeting:
A final meeting to discuss the “outcome(s)” of the review
Should include proposed action items with a follow-up schedule

Discussion Questions?

How many of your institutions have a graduate program review process similar to what was just described?
What are some of the variations that exist?
How often is the review conducted? Remember the words “continuous improvement.”

Discussion Questions? continued

Who should coordinate the review of graduate programs? What should the role of the Graduate School be?
Should the external review be comprehensive in nature, i.e., encompass all roles of the program?
Should the review be tied to other reviews – licensure, accreditation, etc.?
Who pays for the external review, and how much is reasonable?

Outcome-Based, Continuous and Ongoing Review of Graduate Programs

There are fewer Institutional models or norms to go by when it comes to designing and implementing this type of review process

Goal is generally to establish an outcomes-based program that is continuous rather than sporadic

The program periodically reports the nature and outcomes of the review process to the Institution and appropriate external agencies (State, accreditation agencies, etc.)

Results are used by the program and Institution for planning purposes

What Is Outcomes-Based Assessment?

The process of (1) determining the indicators of an effective program, (2) using those indicators as criteria for assessing the program, and (3) applying the results of the assessment toward the ongoing and continuous improvement of the program.

Shift to student-learning-centered concerns:
“What do we want our students to know?”
“How well does the program promote learning?”
Moves from the “quality” of presentation to “How well did the student learn it?”
Assesses achievement of the outcomes on a continuous rather than episodic basis

What Is Outcomes-Based Assessment?

Potential Benefits of Assessment Planning Process

Gives faculty a voice in defining the program and thus a stake in the program

Gives faculty an investment in assessing the program

Provides faculty-approved indicators for gauging and improving the effectiveness of the program

Where Do We Start When Considering an Outcome-Based Process?

It Sometimes Helps to Ask the Following Questions:
Do our graduate programs have clearly stated objectives?
Do we have departmental plans to evaluate the effectiveness of our degree programs?
Do our degree programs have clearly defined faculty expectations (outcomes) for students? Are they measurable or observable?
Do we use data to assess the achievement of faculty expectations for students?
Do we make changes in our programs based on the outcomes of these assessments?
Do we document that assessment is done and results are used to make change?

Outline of the Process

Development of program-specific objectives
Development of program-specific outcomes
Development of an assessment plan or a “schedule” for assessing and reporting outcomes

Development of the necessary database at the program and institutional level

Development of the appropriate managerial tools

Keys to Success

The department should want to do this process

The department must use the information collected
Demonstrate change as a result of findings

The institution must use the information collected
It should somehow tie to resource decisions
Use participation in the process as part of faculty reviews

What Are Objectives?

Program objectives are the general goals that define what it means to be an effective program.

Three Common Objectives

Developing students as successful professionals in the field

Developing students as effective researchers in the field

Maintaining or enhancing the overall quality of the program

What Are Outcomes?

Program outcomes are specific results the program seeks to achieve in order to attain the general goals defined in the objectives.

These can be thought of as faculty expectations of students completing the degree program

Example for Outcome 2 – Effective Researchers

2. To prepare students to conduct research effectively in XXXX in a collaborative environment, the program aims to offer a variety of educational experiences that are designed to develop in students the ability to:

a. read and review the literature in an area of study in such a way that reveals a comprehensive understanding of the literature

b. identify research questions/problems that are pertinent to a field of study and provide a focus for making a significant contribution to the field

c. gather, organize, analyze, and report data using a conceptual framework appropriate to the research question and the field of study

d. interpret research results in a way that adds to the understanding of the field of study and relates the findings to teaching and learning in science

Objectives and Outcomes

Objectives: general, indefinite, not intended to be measured; they set the overall agenda for the program

Outcomes: specific, definite, intended to be measured; they establish the particular means by which the agenda is achieved

Critical Questions for Assessment

What are the indicators of effectiveness for our program? Objectives and Outcomes

How do we determine whether or not our program is meeting the outcomes? Outcomes Assessment Plan

How effective is our program in terms of the outcomes? Outcomes Assessment

What does our assessment suggest for improving the program? Continuous and Ongoing Improvement

Four Questions for Creating an Assessment Plan

1. What types of data should we gather for assessing outcomes?

2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

Types of Data Used

1. Take advantage of what you are already doing

Preliminary exams
Proposals
Theses and dissertations
Defenses
Student progress reports
Student course evaluations
Faculty activity reports
Student exit interviews

Types of Data Used

2. Use Resources of Graduate School and Your Institutional Analysis Group

Enrollment statistics
Time-to-degree statistics
Student exit data
Ten-year profile reports
Alumni surveys

Types of Data Used

3. Use your imagination to find other kinds of data
Dollar amount of support for faculty
Student CVs
Faculty surveys

Data: Two Standards to Use in Identifying Data

1. Appropriateness: Data should provide information that is suitable for assessing the outcome

2. Accessibility: Data should be reasonable to attain (time, effort, ability, availability, resources)

Four Questions for Creating an Assessment Plan

1. What data should we gather for assessing outcomes?

2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

Sources of Data

Students
Faculty
Graduate School
Graduate Program Directors
Department Heads
Registration and Records
University Information Technology
University Planning and Analysis

Four Questions for Creating an Assessment Plan

1. What data should we gather for assessing outcomes?

2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

Frequency of Data Collection

Every semester
Annually
Biennially
When available from individual graduate students:
At the preliminary exam
At the defense
At graduation

Ordering Outcomes for Assessment

More pressing outcomes earlier and less pressing ones later

Outcomes easier to assess earlier and outcomes requiring more complex data gathering and analysis later

Approximately the same workload each year of the assessment cycle

Four Questions for Creating an Assessment Plan

1. What data should we gather for assessing outcomes?

2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

Creating an Assessment Timetable

Standard practice appears to be to call for a short annual or biennial assessment report

Longer cycles lose the continuous and ongoing character of the process

When possible, combining with an external review program (i.e., including assessment reports as part of the self-study) is recommended

Four Questions for Creating an Assessment Plan

1. What data should we gather for assessing outcomes?

2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

Discussion Questions?

How many of your institutions have an outcome-based graduate program review process?
How many of you are considering implementing such a review program?
What are some of the variations that exist?
How often are your assessment reports due?

Discussion Questions? continued

For those of you with an outcome-based review process, or, for that matter, those of you considering implementing such a process: what was (is) the driving force in that decision?
What has been the level of campus buy-in?

Case Studies

Graduate Program Review at:
Southern Illinois University Carbondale
North Carolina State University

Impact of State Mandates on SIUC

Illinois Board of Higher Education’s Priorities, Quality, and Productivity (P.Q.P.) Initiative (1992)

IBHE’s Framework for Reviewing Priorities, Productivity, and Accountability (PPA) in Illinois Higher Education (December 2003)

P.Q.P. in Illinois

Statewide Productivity Assessment of Graduate Programs

1. Capacity in Relation to Student Demand
2. Capacity in Relation to Occupational Demand
3. Centrality in Relation to Instructional Mission
4. Success of Graduates
5. Program Costs

P.Q.P. Guidelines for Elimination of Degree Programs (IBHE, August 1992)

Institutions Should Consider Eliminating Programs:

Whose credit hours, enrollments, and degree production significantly deviate from statewide or institutional averages …

In fields … in which projected statewide job openings are low or projected to slow

In fields that enroll a relatively small proportion of non-majors …

Elimination guidelines cont.

That have been found to have quality deficiencies based upon their most recent program reviews

That exhibit low job placement rates, lack of student and alumni satisfaction and support, and low graduate admissions or pass rates on licensure exams

Whose costs significantly deviate from statewide avg. expenditures per FTE in the discipline …

Role of IBHE, University Boards of Trustees and Campus … in Program Reviews (IBHE, Nov. 1994)

IBHE “has statutory authority to review public institution instructional programs and to communicate to governing boards any programs that the Board finds to be educationally and economically not justified”

Univ. Boards of Trustees have statutory authority to eliminate academic programs

Each Univ. has a Mission Statement which “sets forth the campus’ values and aspirations”

Role of IBHE … cont.

Each Univ. has a Focus Statement that “describes the distinctive strengths and contributions of each of the 12 public universities to Illinois higher education”

Each … must annually produce a Priorities Statement which “should guide decisions to allocate current funds and develop new programs and budget requests.” These statements are to be included in each university’s annual Resource Allocation and Management Plan (RAMP)

Role of IBHE … cont.

“To integrate information from program review into campus, governing board, and state-wide decision making in the P.Q.P. initiative, the public university academic program review process was revised (by IBHE) for 1993-94.”

The revised program review process requires the 12 public universities to submit reviews of similar programs in the same year of an eight-year cycle, and requires IBHE staff to identify issues to be addressed in a statewide analysis prior to campus reviews.

Elements of the Illinois Statewide Program Review Process (IBHE RAMP Manual, 1993)

The IBHE review schedule assures the “submission of reviews of similar programs by all (twelve public) universities at the same time.”

Prior to the review, an IBHE statewide analysis, coordinated with the review schedule, “defines statewide issues, examines capacity in fields of study across universities, and provides comparative information for institutional reviews of individual programs.”

Elements of Program Review cont.

Illinois universities conduct the program reviews “according to campus-developed procedures and submit the results of reviews to the IBHE.”

IBHE staff analyze the program review reports and provide “recommendations on the educational and economical justification of selected programs … in the staff’s annual Priorities, Quality and Productivity (P.Q.P.) report.”

Universities must follow the coordinated review schedule but “may conduct reviews within a reasonable period (e.g. up to 3 years) prior to submission date in order to coordinate reviews with accreditation and other evaluations.”

Elements of Statewide Analysis for Each Program Area to be Reviewed (RAMP Manual, 1993)

1. Trends in enrollment and degrees granted
2. Student characteristics
3. Program costs
4. Occupational demand
5. Recommendations for expansion or reduction of programs on a statewide basis

Universities will be asked to respond to the elements of the statewide analysis in their program reviews

Elements of the Program Review Reports to the IBHE (RAMP Manual, 1993)

A 1 to 2 page summary of the review, submitted by July 1st of each year, “should address the following questions and the key findings and recommendations in each of these areas should constitute its substance”:

1. Student demand
2. Occupational demand
3. Centrality to instructional mission
4. Breadth
5. Success of graduates
6. Costs
7. Quality
8. Productivity

P.Q.P. Impact on SIUC Graduate Programs

Graduate Council from 1993-96 examined all graduate programs on campus, using data supplied by IBHE and also generated by the Graduate School

Programs scheduled for elimination or substantial reduction had the opportunity to respond to recommendations in a series of Graduate Council meetings and special forums

P.Q.P. Impact cont.

7 Ph.D. programs were eliminated or consolidated (Communication Disorders; Higher Ed.; Molecular Science; Physical Ed.; Special Ed.; Geology; and Geography)

Eventually, Geography (Liberal Arts), Geology (Science), and Agribusiness Economics (Agriculture) created a new interdisciplinary Ph.D. program in Environmental Resources and Policy within the Graduate School

5 master’s programs and several post-baccalaureate specializations were eliminated

Most of these eliminations occurred outside of the regular review process, though some were informed by that process

SIUC Graduate Council Response

The P.Q.P. initiative illuminated flaws in the review process and especially problems with inconsistent data about programs

GC Program Review Committee took a hard look at the problems and suggested substantial changes in the campus review process

In 1999, a new review format was put in place (see handout)

Problems with IBHE Statewide Review of Similar Programs

Clearly, IBHE through P.Q.P. hoped to use the statewide review of similar programs as a vehicle to reduce or eliminate programs seen as duplicative, too costly, unproductive, etc.

At first, IBHE wanted to appoint the reviewers; this proved to be impractical

Accreditation cycles did not march in sync
IBHE went through several attempts to revise the review cycle mandated in P.Q.P.

IBHE Review cont….

IBHE creates the Illinois Commitment in February 1999, including the stipulation that “By 2004, all academic programs will systematically assess student learning and use assessment results to improve programs.”

IBHE “Redesign of Public Institution Academic Program Approval and Review Processes” (April 2002); see handout, especially p. 38

SIUC had created an annual “outcomes-based” assessment requirement for all programs in 1999, in part, because of the North Central accreditation process; see http://www.siu.edu/~assessment/

IBHE … Priorities, Productivity, and Accountability (PPA, December 2003)

“Illinois’ system of higher education must have a clear sense of its priorities, ensure the efficient and productive use of existing resources, and demonstrate public accountability before seeking additional assistance from the taxpayer and student”

IBHE PPA in Action

IBHE board chair indicates that the focus of PPA will be on faculty productivity at the 12 publics and 50+ community colleges, without regard for the mission or focus of the institution

The 12 publics indicate that this approach is not practical given the nature of their differing missions and focuses, and the board chair backs off

Two subcommittees formed to begin the process

PPA cont.

PPA in Action, cont. (Committee Minutes, May 25, 2004)

Subcommittee A is focusing on “mission/focus statements, program approval processes, and more qualitative issues,” including:

1. Reviewing “statewide enrollment by program (all degree levels), new programs, degrees awarded, programs discontinued”

2. Considering the impact of technology on faculty work

PPA in Action, cont.

Faculty review includes

1. “Faculty roles in terms of: hours per week of formal class, preparation, conferences; supervising remedial or advanced work; keeping up with discipline; course design,…”

2. “Measure of work (Quantitative): contact hours, release and/or assigned time, class size”

3. “Promotion/Tenure: including research, teaching, service”

PPA in Action, cont.

Subcommittee B is focusing on “state-level accountability mechanisms and processes”

The subcommittees continue to meet and no recommendations have been made as of this date

The saga continues …

Other Factors that Currently Impact or Inform the Review/Assessment Process at SIUC

Southern at 150: Building Excellence Through Commitment (2002; see http://news.siu.edu/s150/)

“The goal … is to articulate a series of commitments and actions that will place us among the top 75 public research universities in the United States by the year 2019, our 150th anniversary, while we continue to provide the foundation of academic, economic, and social progress in Southern Illinois”

Southern at 150, cont.

Commits to offering a “Progressive Graduate Education” and to “Lead in Research, Scholarly, and Creative Activity”

“By 2019, 25 percent of our total enrollment will be graduate students” (increasing from approximately 4,000 to 6,000 graduate students)

Southern at 150, cont.

“Research and scholarship will be integrated into every decision made on campus. Building a culture where research becomes an integral part of all undergraduate and graduate programs is essential. Substantially enhance research and scholarly productivity.”

Southern at 150, cont.

“A Review of the Research Enterprise” at SIUC, Washington Advisory Group (July 2003)

1. Focused on Sciences, Engineering, and School of Medicine

2. This review looked at all programs in these areas and recommended strategies.

Southern at 150, cont.

“Research and Scholarship in the Arts, Humanities, and Social Sciences at Southern Illinois University Carbondale,” (Consultant Team Report, October 2004)

SIUC’s “Faculty Hiring Initiative,” a long-term commitment of recurring resources each year to meet the strategic goals set by Southern at 150. Reviews/assessment play a key role in this program

Graduate Program Review at NC State – External Review

Current Process: Administration
Administered by the Dean of the Graduate School
Initiated by program or Graduate School
Often at the Department level; includes multiple degrees/programs
Partner with College and/or accreditation reviews

Graduate Program Review at NC State – External Review

Current Process: Objectives
Reviews are conducted to gain a clearer understanding of a program’s:
Purpose(s) within NC State
Effectiveness in achieving purposes
Overall quality
Future objectives
Changes needed to achieve objectives

Graduate Program Review at NC State – External Review

Current Process: Operational Procedures
10-year review cycle
Components:
Internal self-study
External “team” review
Review team report – oral & written
Program response prepared
Administrative meeting: Graduate Dean, Provost, Vice-Chancellor for Research, College Administration, Department Head, Director of Graduate Programs, Review Team Chair

Graduate Program Review at NC State – External Review

Current Process: Information Made Available
Last program review report & response
5-year graduate program profile (updated annually):
Enrollment: numbers, demographics
Applications: numbers applied/admitted/enrolled; quality indicators
Number of degrees awarded, time to degree
Financial support
Exit interviews – all thesis and dissertation students

Questions We Began to Ask Ourselves?

Does each of our degree programs have clearly defined outcome goals?
Are they measurable or observable?
Do we obtain data to assess the achievement of degree program outcomes?
Do we use assessment results to improve programs?
Do we document that we use assessment results to improve programs?

Motivation For Change

Growing culture of program improvement on our campus – both undergraduate and graduate

Undergraduate Student Affairs had implemented an outcome-based review program that was now operational

SACS was just around the corner

Ultimate Question for NC State Became

How could we create a hybrid that evaluated program quality and measured student learning?

Accomplish administrative goals regarding evaluation of quality related to funding and institutional goals

Accomplish graduate school goals related to program improvement

The ultimate goal is to improve educational programs, not fill out reports to demonstrate accountability

Studying & Revising the Process

Graduate Dean appointed a Task Force
Made up of “stakeholders”
Relied on on-campus expertise: focus groups with administrators, faculty, students, etc.
Could not utilize Undergraduate Program Review personnel (workload issue; new perspectives)

Bottom Line – The opportunity for change is at the faculty level, so we want the process to address improvement at that level.

Graduate Program Review at NC State

Task Force Goals:
Study/revise the existing process
Evaluate purpose and goals of review
Examine current protocols, especially with respect to:
Continuous and ongoing expectation
Outcomes-based assessment requirement
What should the role of the Graduate School and its Administrative Board be?
How can the outcome of a review be most effective?
What infrastructure is necessary to operate the process?

Graduate Program Review at NC State

Task Force Key Findings: The current process is fairly typical

Graduate program reviews typically are conducted on a 6- to 10-year cycle

The current process follows Council of Graduate School guidelines (as outlined previously)

An external review component should be continued

Greater emphasis should be placed on student learning outcomes

Graduate Program Review at NC State

Task Force Key Findings – continued:
The revised process should be more continuous and ongoing
The review process should result in appropriate follow-up
Current resources do not allow review of all graduate programs on a 10-year cycle

What We Decided to Do

Continue the traditional external review program on an 8-year schedule

Continue to partner with external reviews already conducted for accreditation or other purposes

Emphasize development of program-specific student learning outcomes and assessment procedures to determine if they are being achieved

What We Decided to Do

In addition to the External Program Review, we will require each program to:
Develop program-specific objectives and outcomes
Develop an assessment plan outlining the assessment activity(ies) they will conduct annually
Complete a biennial assessment report that is submitted online

What We Decided to Do

Provide the training necessary for programs to implement these changes
Development of objectives, outcomes, assessment plans

Partner with University Planning and Analysis and other campus units to improve utility of centralized data collection and processing
Assist in data collection for assessment plans at the institutional level

What We Decided to Do

Increase efforts relative to follow-up after the graduate program review – assess progress on recommendations

Tie the annual assessment and biennial reports to the external review by incorporating the changes made as a result of assessment into the self-study

Development of an “Action Plan” agreed upon by University, Graduate School, College, and Department Administration

Revised Review Process

Initial Year 1 (Start-Up)
•Development of objectives, outcomes, and assessment tools
•Identification of data sources and beginning of data collection

Cycle Year 2 (also 4 and 6)
•Ongoing assessment & self-study by grad faculty
•Programmatic changes
•Brief biennial assessment report

Cycle Year 3 (also 5 and 7)
•Continued data collection pertinent to outcomes and assessment measures
•Compact Initiatives

Cycle Year 8 (program review)
•Self-study report
•External review
•Review report
•Program response
•Action plan

Training Workshops Provided

Workshops provide the training necessary for programs to implement:
Graduate Program Review – Where we are, where we are headed, and why
Assessing the Mission of Doctoral Research Universities – a workshop on outcomes-based assessment put on by outside experts
Creating Outcomes and Objectives
Creating an Assessment Plan
Utilizing the Graduate School Managerial Tools
Developing an Institutional Database for Assessment of Graduate Programs – to be developed

Managerial Tools Created for Program Review – Website (screenshots)

Managerial Tools Created for Program Review – Profile Data (screenshots)

Managerial Tools Created for Program Review – Review Document Management (screenshots)

What We Have Learned/ Discussion Points

The process of change takes time
We have been at this for going on three years (since the start of the Task Force) and have not yet collected the first biennial report

Communication is the key to success
Clearly communicated goals and expectations are important
Flexibility – faculty in many programs on our campus prefer the term “faculty expectations” to “outcomes” – so be it

What We Have Learned/ Discussion Points continued

This kind of change has to be from the ground (faculty) up, not top (administration) down
Even then, faculty are very skeptical about workload versus value

This type of review process requires significant human resources
Training, data collection, analysis, interpretation, etc.

A key to our success will be how much of this can be institutionalized

