
San Jose City College

Accreditation Gap Analysis and Recommendations

January 12, 2012

Matthew C. Lee, Ph.D.


Table of Contents

Preamble

Background: Accreditation Sanctions

Review and Analysis Process

College Responses to Team Recommendations

General Observations

College Recommendation 1: Integrated Planning and Assessment

College Recommendation 2: Program Review

College Recommendation 3 and Commission Reminder: Student Learning Outcomes



Preamble

At its January 2011 meeting, the Accrediting Commission for Community and Junior

Colleges (ACCJC) placed San Jose City College on Probation, and required a Follow-Up

Report in October 2011 to demonstrate resolution of six District-level Recommendations,

six College-level Recommendations, and three Commission Concerns. An evaluation

team visited campus in early November 2011 to verify the College’s progress, and issued

its own report in December. The team’s report indicated that the College had made

substantial progress on several fronts, but had not yet fully resolved Recommendations in

the following areas:

District Recommendation 1: Research and evaluation capacity

District Recommendation 5: Board ethics policy and a schedule for review of all

Board policies

District Recommendation 6: Delineation of District and College functions, and

evaluation and improvement of District governance and functional effectiveness

College Recommendation 1: Integrated planning and assessment

College Recommendation 2: Program review

College Recommendation 3: Student learning outcomes

College Recommendation 5: Completion of outstanding faculty and adjunct

evaluations, and engagement in student learning improvement as a component of

evaluations

College Recommendation 6: Understanding and effectiveness of delineation of

District and College responsibilities

The Commission will issue its decision regarding the College’s accreditation status in late

January or early February 2012.

To expedite progress in addressing College Recommendations 1-3, the College

contracted with me to evaluate the College’s progress to date and make recommendations

on the direction of work needed to reach full resolution. The primary purpose of this

report is to summarize my findings and identify actions that the College can take to close

the gap between where it is now and where it needs to be, in terms that are more concrete

and detailed than those in Recommendations 1-3. The aim is not just to resolve those

Recommendations and gain reaffirmation of accreditation—to “get out of trouble”—but

more importantly, to improve the effectiveness of San Jose City College permanently.

The consultant recommendations listed in each section are designed to facilitate planning

and implementation of lasting, positive change.

Beneath each Recommendation, I have reproduced the ACCJC Standards to which it

refers. To formulate and execute the most productive responses to each

Recommendation, the College needs to understand those Standards as well as the

language of the Recommendation itself.


Finally, it is the nature of a report such as this to focus more on the specific steps that still

need to be taken than on what is already in good shape or well underway. Consequently,

readers will not see as much coverage of the many positive aspects of the College as

might appear in, say, a history of the institution. I urge readers to view the report not as a

source of discouragement, but rather as a call to action, to move toward a brighter future

for the College, its personnel, and most importantly, its students.

Background: Accreditation Sanctions

As noted in the Commission’s Accreditation Reference Handbook, Probation is a stronger

sanction than Warning. It indicates that the institution actually “deviates significantly

from the Commission's Eligibility Requirements, Accreditation Standards, or

Commission policies, but not to such an extent as to warrant a Show Cause order or the

termination of accreditation.” Show Cause, the strongest sanction short of termination, is

a Commission order to the institution to “show cause why its accreditation should be

continued.” A college under any of these three sanctions does retain its accreditation.

The final sanction, in the absence of sufficient corrective action, is Termination of

accreditation. U.S. Department of Education rules require termination if an institution

fails to come into compliance with Accreditation Standards within a two-year period,

though the Commission may grant an extension of that deadline for good cause.

It is important to note that the Commission’s policies do not require it to follow the

sequence of steps from Warning through Probation and Show Cause to Termination. It

has the ability to terminate accreditation at any time if it concludes that the institution is

significantly out of compliance with the Standards or the Eligibility Requirements.

I want to be very clear about all these Commission sanctions, not to frighten anyone, but

to highlight the severe consequences of inadequate action, to call attention to the fact that

the clock is ticking, and to convey a sense of urgency to the College community. As the

reader will see below, San Jose City College has made significant progress since the

team’s initial visit, and some progress since its follow-up visit, but much more work is

needed before the College is back in the Commission’s good graces with respect to

Recommendations 1-3.

Review and Analysis Process

This report is based in part on my review and analysis of well over 200 documents,

including the following:

2010 Accreditation Self-Study

2010 Evaluation Report and January 31, 2011 Commission Action Letter

Spring 2011 ACCJC Annual Report

District and College foundational statements, including the Mission, Vision, and

Values

Board policies related to Recommendations 1-3


Research documentation (e.g., program review data in Reporting Portal sample

output; Administrator Turnover Survey, July 18, 2011)

Documentation of Professional Development Day activities

Selected meeting minutes and/or other documentation of the work of the College

Planning Council, Finance Committee, Facilities Committee, Campus Technology

Committee, Research Power Users Group, and other groups

The educational and facilities master plan, technology master plan, and selected

other institutional planning documents

Samples of documents related to planning and program review at each

organizational level

Documentation of selected resource allocation requests and priorities from the

most recent available period

A sample of Course Outlines of Record and other curricular materials

Documentation of selected course, program, and institutional student learning

outcomes (SLOs), and of service outcomes and objectives

Samples of course, program, and institutional SLO assessments

Documentation of applicable communication, dissemination, and discussion

mechanisms

Collective bargaining contracts

Documentation of policies, structures, and processes associated with all the items

listed above

In addition, I conducted structured interviews with the following people during a two-day

visit to campus:

Nicholas Akinkouye, Oleg Bespalov, Elaine Burns, Mary Conroy, Celia Cruz, Rebecca Gamez, Romero Jalomo, Barbara Kavalier, Leandra Martin, and Dorothy Pucay

I also participated in one Accreditation Oversight Task Force meeting and one

President’s Cabinet meeting.

The findings in this report thus rest on a substantial amount of evidence, and I am

confident that they accurately reflect that evidence. However, I have not read every

possible document, nor have I interviewed every employee and student. For example, I

was unable to find minutes or other documentation of meeting deliberations for the

current year, or even for last year, for many key campus groups on Infostore or the SJCC

public website. To the extent that the information I have analyzed is not sufficiently

comprehensive, or not entirely representative of the College’s structures, processes, and

issues, it is possible that my findings in some particulars might be subject to revision. Of

course, it is up to the President and the College to decide what weight to give those

findings, and how best to respond to my recommendations.


College Responses to Team Recommendations

General Observations

This section contains observations that apply to resolving multiple Commission

Recommendations and/or improving institutional effectiveness overall.

1) Probation is a very serious Commission sanction that requires immediate and

sustained corrective action involving the entire campus community. Gaining full

reaffirmation of accreditation is the business of everyone at the College, not just the

relatively small subset of people who, as at every community college, typically

perform most committee service. All employees need to recognize the gravity of this

situation, and a greater proportion need to step up and offer constructive engagement

and active assistance in moving the College forward, especially given the fact that the

College has less than one year before the Commission’s two-year deadline for full

resolution of all Recommendations. See Consultant Recommendations under College

Recommendation 1 below.

2) The issues identified by the visiting team are of long standing at the College. For

example, the Commission placed SJCC on Warning in January 2005 on many of the

grounds it cited again in January 2011. Even though the Commission removed the

institution from Warning in January 2006, it clearly became concerned, after further

progress reports in 2007 and 2008, that the same issues persisted. The College has

made many attempts to resolve these problems, but to date all have failed to produce

lasting improvements. Under new leadership, the College now has the proverbial

golden opportunity to move forward in the right direction on a permanent basis.

3) Judging from the minutes I sampled, crucial committee work too often proceeds at a

snail’s pace or is otherwise ineffective. Some issues and ideas surface and resurface

repeatedly, eluding resolution and disposition for months on end. Although I have no

direct evidence, I suspect that, as at many other community colleges, committee

members and conveners are often dropped into their roles with minimal preparation,

presumably in the expectation that they will magically understand how to be effective

in those roles. All too often, in my experience, that is not the case. Effective

committee leadership and participation are skills that can and should be taught.

4) One of the Shared Recommendations emphasized the necessity of improving the

research capacity available to the College. Integrated planning, program review, and

outcomes assessment all require reliable sources of accurate data, as well as resources

to help users analyze and interpret those data, so the College’s resolution of all three

Recommendations covered in this report depends on the District’s resolution of

Shared Recommendation 1. See in particular Consultant Recommendations under

College Recommendation 2 below.

5) The Accreditation Oversight Task Force has done yeoman service in leading efforts

to resolve the Recommendations, but there is no standing Accreditation Committee at

the College. I strongly urge the institution to constitute such a committee, to help

ensure that the College maintains its awareness of and adherence to the ACCJC

Standards on an ongoing basis, not just during preparation of the self-evaluation

every six years.


6) Governance and Climate

a) In the San Jose-Evergreen District, the Board of Trustees—and the Chancellor

and College Presidents as designees—rely primarily on the Academic Senate in

all academic and professional matters except two. Mutual agreement applies to

processes for program review and institutional planning and budget development.

Academic Senate leaders and members have contributed substantially to College

efforts to respond to all the Commission’s College Recommendations, including

those concerning integrated planning, program review, and outcomes.

b) The Board and the President have provided the management team, classified staff,

and students with opportunities to participate effectively in governance in those

areas that have a significant effect on them, as required by Title 5. Consequently,

leaders and members of those groups have also contributed substantially to

College efforts to respond to the Recommendations.

c) Overall, the College enjoys a knowledgeable and caring set of faculty, staff, and

managers, clearly dedicated to helping students succeed. However, that

dedication and caring, based on my limited observations, seem to be focused

primarily on the needs of one’s own students and/or one’s own circle of office or

departmental colleagues, and in many cases do not appear to extend to the needs

of the institution as a whole.

d) Problems in institutional governance and decision-making have characterized the

College for a long time, and the effects still ripple through its culture at every

turn, judging from the comments of many interviewees and meeting participants.

In a small Spring 2011 pilot survey on governance and decision-making, there

was overall disagreement that “structures, processes and practices that facilitate

effective communication among the constituency groups” existed, and an even

split on whether “decision-making processes at the college are clearly

communicated and understood.” Longstanding tensions reportedly still exist

among constituency groups, and among unions, constituency groups, and

management. Administrative turnover has been inordinately high; more than one

manager with new initiatives in hand has met with reactions to the following

effect: “You’re not going to stay, so why should we bother?” Reorganizations

have reportedly undercut traditionally useful lines of communication.

Governance participation is reportedly less pervasive than it ought to be, and the

“usual suspects” populate most committees.

e) Perhaps not surprisingly, the overall campus climate appears to have suffered for

a long time, too. There are not as many opportunities for employees to have

social interaction and sheer fun as there once were, and public recognition for

good work is reportedly scarce. Of course, the campus climates of many

California community colleges have been hurt by the economic stresses of the last

several years, but my perception, again based on limited observations, is that the

malaise at SJCC has other origins.

f) On the other hand, significant positive movement appears to have occurred over

the last year or so, which most people reportedly attribute in part to the arrival and

actions of the new President. In the Spring 2011 survey, there was substantial

agreement that “college leaders encourage members of the college to take the

initiative in improving institutional effectiveness” and that respondents were

aware of the role their constituency groups play in shared governance. A small

group of faculty, classified, and administrators is meeting periodically with the


President to identify ongoing informal activities that will promote a more positive,

collaborative climate. All managers are reportedly participating in the Genuine

Leadership Program to help them become more effective leaders. It is

counterproductive for any organization to dwell on past wrongs, and crucial to

move forward when the opportunity arises. These changes give cause for hope,

and the College must nurture that sense of hope whenever and wherever it takes

root.


College Recommendation 1: Integrated Planning and Assessment

In order to meet the standards, the college must complete, implement, and make public a

mission-based strategic plan that includes quantifiable goals and measures of student

achievement; complete and implement a regular cycle of review, including and using

information and data to update plans on a regular basis, and evaluate progress towards

strategic plan goals that use the results of the evaluation for improvement. Student

learning outcomes and program review for all departments should be integrated into the

planning process and cycle at the college, program, and course levels, and planning and

program review should inform resource allocation. Finally, measurable student

achievement and institutional effectiveness measures should be identified and

incorporated within planning processes. (IA.4, IB.1, IB.2, IB.3, IB.4, IB.5, IB.6 and IB.7,

IIIB.2, IIIB.2.b, IIIC.2, IIID.1, IIID.1a, IIID.1d, IIID.3, IVA.2a, IVA.2b, IVA.3)

IA.4. The institution’s mission is central to institutional planning and decision making.

IB.1. The institution maintains an ongoing, collegial, self-reflective dialogue about the continuous

improvement of student learning and institutional processes.

IB.2. The institution sets goals to improve its effectiveness consistent with its stated purposes. The

institution articulates its goals and states the objectives derived from them in measurable terms

so that the degree to which they are achieved can be determined and widely discussed. The

institutional members understand these goals and work collaboratively toward their

achievement.

IB.3. The institution assesses progress toward achieving its stated goals and makes decisions

regarding the improvement of institutional effectiveness in an ongoing and systematic cycle of

evaluation, integrated planning, resource allocation, implementation, and re-evaluation.

Evaluation is based on analyses of both quantitative and qualitative data.

IB.4. The institution provides evidence that the planning process is broad-based, offers opportunities

for input by appropriate constituencies, allocates necessary resources, and leads to improvement

of institutional effectiveness.

IB.5. The institution uses documented assessment results to communicate matters of quality assurance

to appropriate constituencies.

IB.6. The institution assures the effectiveness of its ongoing planning and resource allocation

processes by systematically reviewing and modifying, as appropriate, all parts of the cycle,

including institutional and other research efforts.

IB.7. The institution assesses its evaluation mechanisms through a systematic review of their

effectiveness in improving instructional programs, student support services, and library and

other learning support services.

IIIB.2. To assure the feasibility and effectiveness of physical resources in supporting institutional

programs and services, the institution plans and evaluates its facilities and equipment on a

regular basis, taking utilization and other relevant data into account.

IIIB.2.b. Physical resource planning is integrated with institutional planning. The institution

systematically assesses the effective use of physical resources and uses the results of the

evaluation as the basis for improvement.

IIIC.2. Technology planning is integrated with institutional planning. The institution systematically

assesses the effective use of technology resources and uses the results of evaluation as the basis

for improvement.

IIID.1. The institution relies upon its mission and goals as the foundation for financial planning.

IIID.1.a. Financial planning is integrated with and supports all institutional planning.

IIID.1.d. The institution clearly defines and follows its guidelines and processes for financial planning and

budget development, with all constituencies having appropriate opportunities to participate in the

development of institutional plans and budgets.

IIID.3. The institution systematically assesses the effective use of financial resources and uses the

results of the evaluation as the basis for improvement.


IVA.2.a. Faculty and administrators have a substantive and clearly defined role in institutional governance

and exercise a substantial voice in institutional policies, planning, and budget that relate to their

areas of responsibility and expertise. Students and staff also have established mechanisms or

organizations for providing input into institutional decisions.

IVA.2.b. The institution relies on faculty, its academic senate or other appropriate faculty structures, the

curriculum committee, and academic administrators for recommendations about student learning

programs and services.

IVA.3. Through established governance structures, processes, and practices, the governing board,

administrators, faculty, staff, and students work together for the good of the institution. These

processes facilitate discussion of ideas and effective communication among the institution’s

constituencies.

ER19. Institutional Planning and Evaluation: The institution systematically evaluates and makes public

how well and in what ways it is accomplishing its purposes, including assessment of student

learning outcomes. The institution provides evidence of planning for improvement of

institutional structures and processes, student achievement of educational goals, and student

learning. The institution assesses progress toward achieving its stated goals and makes decisions

regarding improvement through an ongoing and systematic cycle of evaluation, integrated

planning, resource allocation, implementation, and re-evaluation. [Note: Eligibility Requirement

19 is not cited in this Recommendation, but the evaluation team, in its initial evaluation report,

noted that the College only “partially met” this fundamental requirement.]

Observations: Progress to Date

1) In Spring 2011, the Academic Senate President, the Accreditation Oversight Task

Force, and the President agreed that the work required to resolve this

Recommendation needed to begin immediately, so the President formed an ad hoc

Strategic Planning Team (SPT) to expedite that work. (The College reportedly

recognizes that a permanent strategic planning committee is needed, but has not yet

established one; see below.) SPT included representatives from administration and

faculty, and later from classified staff as well. It drafted a Strategic Plan that

explicitly incorporated the College’s Mission, Vision, and Values, as well as its

Strategic Goals. The Plan also required the collection and analysis of quantitative and

qualitative evidence to assess the College’s impact on community needs and to

evaluate its effectiveness as demonstrated in program review, SLO assessment, and

application of a set of measurable Performance Indicators tied to the Goals.

Moreover, the Plan required alignment of planning and resource allocation,

evaluation of the planning process, and integration with other major planning

processes and structures, all of which were reflected in an annual strategic planning

timeline. Finally, it presented a resource allocation model that included program

review, outcomes assessment, administrative review, and input by numerous shared-

governance committees. SPT disseminated the draft Plan widely, collected feedback,

and made changes based on that feedback. SPT presented the final version of the

Plan at Professional Development Day in April 2011.

2) Also in Spring 2011, the College piloted the revised resource allocation process,

which reportedly operated smoothly and produced recommendations to the President

for faculty positions and budget augmentations. Beginning this year, any program

requesting resources through this process must have submitted its latest

comprehensive or annual program review documents.

3) The Academic Senate reportedly relied on program review submissions during its

Spring 2011 deliberations on faculty hiring priorities.


4) During Summer and Fall 2011, the Research Power Users Group (PUG), composed

of faculty and administrative representatives, developed definitions for Strategic

Planning Performance Indicators under each of the College Strategic Goals, gathered

some status data related to most Indicators, and recommended benchmarks for most

of the status data categories.

5) Program review forms in both instructional and student services areas require

consideration of program outcomes and their assessment, the program’s contribution

to the College’s Mission, and how the program integrates the College Strategic Goals

and/or Institutional SLOs (ISLOs). The template proposed for Administrative

Services in Summer 2011 requires consideration of program outcomes and

assessment and of the program’s contribution to the College’s Mission. All templates

require the application of both qualitative and quantitative data.

6) Communication

a) Information about integrated planning has been conveyed to the campus

community over the last two years through the College Planning and

Accreditation pages of the SJCC website, presentations on at least the last three

Professional Development Days, newsletters and progress reports published two

to three times per year, and meetings of the Accreditation Oversight Task Force,

College Planning Council (CPC), and other shared-governance bodies.

b) The College has chosen to use a shared internal directory called Infostore as the

centralized electronic repository of most committee minutes and other documents

related to integrated planning. It is accessible from any campus computer to any

user with appropriate sign-on credentials. In a few cases (e.g., PUG), minutes are

accessible from the SJCC website instead.

c) According to the Self-Study, the Vice President for Administrative Services

(VPAS) used to hold campus forums on recommended resource allocation

priorities after they had been reviewed by the Vice Presidents. Prior to Fall 2011,

it had been several years since the College had had a VPAS position, and at this

remove it is unclear whether that process was designed to gather input that might

result in changes, or merely to share the recommendations. It is also unclear

whether the new VPAS will hold such forums in the interests of transparency.

Observations: Issues Requiring Action

1) Accreditation Requirements

a) Like all community colleges operating under ACCJC Standards, San Jose City

College was supposed to have achieved the highest level on the Commission’s

Rubric for Evaluating Institutional Effectiveness—Part II: Planning, Sustainable

Continuous Quality Improvement (SCQI), over four years ago. The team found

that as of less than three months ago, the College had not mastered even the

Proficiency level, which has the following characteristics:

The college has a well-documented, ongoing process for evaluating itself in

all areas of operation, analyzing and publishing the results and planning and

implementing improvements.

The institution’s component plans are integrated into a comprehensive plan to

achieve broad educational purposes and improve institutional effectiveness.


The institution effectively uses its human, physical, technology, and financial

resources to achieve its broad educational purposes, including stated student

learning outcomes.

The college has documented assessment results and communicated matters of

quality assurance to appropriate constituencies (documents data and analysis

of achievement of its educational mission).

The institution assesses progress toward achieving its education goals over

time (uses longitudinal data and analyses).

The institution plans and effectively incorporates results of program review in

all areas of educational services: instruction, support services, library and

learning resources.

The SCQI level is even more demanding:

The institution uses ongoing and systematic evaluation and planning to refine

its key processes and improve student learning.

There is dialogue about institutional effectiveness that is ongoing, robust and

pervasive; data and analyses are widely distributed and used throughout the

institution.

There is ongoing review and adaptation of evaluation and planning processes.

There is consistent and continuous commitment to improving student

learning; and educational effectiveness is a demonstrable priority in all

planning structures and processes.

2) The College has undertaken institution-wide planning efforts in the past, but none of

those processes became a permanent feature of the College. The reasons are no doubt

many, but based on the information I have gathered, probably include executive

instability, intergroup disunity, and a campus culture that appears to have become

cynical about new initiatives; moves very slowly, if at all, away from the status quo;

and tends to resist legitimate leadership efforts to move the College forward as

attacks on personal or collective prerogatives. This lack of institutionalization is a

major reason for the Commission’s sanctions in both 2005 and 2011. Now that San

Jose City has adopted a relatively straightforward approach to integrated strategic

planning on paper, and has piloted portions of it successfully, it must execute the

process consistently and permanently in the real world to reach the SCQI level. Such

an ambitious task will require both administrative and constituency leadership to

unify their efforts in that direction, and the College community as a whole to

recognize the importance of maintaining progress in every planning cycle, at every

level, every year.

3) SPT, which coordinated development of the Strategic Plan in Spring 2011, was an ad

hoc group, and at present there is no institutional home for evaluating and improving

the strategic planning process, though CPC is charged with reviewing progress

annually on implementation of the Strategic Plan. A recent proposal to establish a

standing Strategic Planning Committee has made no progress so far in the Academic

Senate.

4) As the team pointed out to the President, there has not yet been enough time for the

College to complete a full cycle of integrated strategic planning, which requires

evaluating progress toward goals based on solid evidence and using the results of that

evaluation to update plans and implement improvements. It is crucial for the College

to make as much progress as possible, as quickly as possible (consistent with sound


practice), to complete the cycle no later than October 2012, the Commission’s two-year deadline. (The Action Letter listed October 2011 as the two-year deadline in

error.)

5) The Evaluation of Planning section of the Strategic Plan focuses on the strategic

planning process per se and the Performance Indicators, and does not appear to

include evaluation of the program review process or the budget development process.

The recommended charge of the Program Review Committee (PRvC) includes annual

evaluation and recommended improvement of the program review process, and the

Finance Committee charge includes updating, but not evaluation, of the budget

development process. However, it is unclear how those two processes actually

undergo systematic evaluation and improvement. (See Consultant Recommendations

under College Recommendation 2 below.)

6) Further work is needed on the Performance Indicators developed by PUG and

approved by SPT. The scope of Indicators intended to measure College performance

on a given Strategic Goal does not always match the stated scope of the Goal. The

relationship between the Indicator definition and the associated status data and

benchmarks is too often indirect or unclear. Some Indicators defined as outcomes

(e.g., increased awareness) are measured by offerings and other inputs. Not all

Indicators have benchmarks, and a work plan and schedule for gathering, analyzing,

interpreting, and communicating Indicator data on a regular basis has evidently not

been developed. The departure of the District’s Interim Executive Director of

Research and Institutional Effectiveness (RIE), who was an active participant in

PUG, might complicate efforts to create and execute such a work plan; the President

has expressed the hope that the new Executive Director would serve on PUG.

Finally, the Performance Indicators documentation posted on the College Planning

website is inconsistent with that approved by SPT.

7) Communication and Dialogue

a) Communication of information about integrated planning and resource allocation

is not yet systematic and consistent through the College’s electronic media. For

example, the 2010-11 Budget Development link on the About SJCC website leads

to a page with old or undated information, though the College Planning website

contains links to the new resource allocation and strategic planning models. The

most recent minutes of the Finance Committee available in its section of Infostore

date to Spring 2010, though the evidence files for the Follow-Up Report

contained minutes through Spring 2011. PRvC minutes appear neither in

Infostore nor on the SJCC website, even though the Spring 2011 minutes of its

predecessor, the Program Review Workgroup, were available in the evidence

files.

b) The Commission’s Rubric for Planning calls for the institution, based on its

integrated planning and assessment results, to communicate “matters of quality

assurance to appropriate constituencies,” a quote from Standard I.B.5. In this

context, “constituencies” can be construed to include the community served by

the College. It is unclear whether the College has communicated the Strategic

Plan to the wider community.

c) Dialogue about student learning and improving education, in part through

integrated planning, reportedly does occur on campus, mostly within departments,

but systematic, regular opportunities for engaging in discussions across

departmental boundaries are limited. The College now has only two Professional Development Days (PDD) annually, down from four in the past, and enforcement

of mandatory attendance is reportedly weak. PDD sessions on integrated

planning, program review, outcomes assessment, and related topics have been

held, but their effectiveness in fostering dialogue that is “ongoing, robust, and

pervasive” is not clear.

d) Although the Strategic Plan and associated documents have been widely

disseminated on campus, the level of understanding of integrated planning among

members of the campus community—beyond those who have participated

directly in the strategic planning process—has evidently not yet been evaluated,

and is therefore unclear.

8) Resource Allocation Process

a) The resource allocation model in the Strategic Plan includes consideration of

resource needs in four categories: personnel, technology, equipment/supplies, and

facilities. However, the pilot implementation of the process dealt only with

faculty hiring and budget augmentations for supplies and equipment (including

some computers and peripherals). The model has not yet been exercised for other

technology or facilities modification requests. Given the evaluation team’s

concerns in 2010 about inadequate integration of physical and technology

resource planning with broader institutional planning, it is especially important

for the College to ensure that execution of the new process during Spring 2012

demonstrates explicit consideration of resource requests in all four categories.

b) College personnel in a good position to know have expressed uncertainty about

how program review will actually affect resource allocation decisions when the

model is fully implemented. It is crucial for participants in the process to make

that linkage very clear to everyone, document both the process and its results

carefully, and disseminate that information effectively to the entire College

community.

9) The November 4, 2010 Campus Technology Committee (CTC) meeting discussed the

need for a systematic evaluation of technology needs. The Committee’s Spring 2011

plan to revise the campus Technology Plan did call for alignment with the six College

Strategic Goals, and for strategies for each technology goal to incorporate program

review in some fashion, although program review findings did not constitute an input

into the Plan. I do not know the results of these initiatives, since later CTC minutes

were not available on Infostore or the SJCC website.

Consultant Recommendations

1) The President, the Vice Presidents, constituency leaders, and especially the core

group of faculty, managers, and staff who have worked hard over the past year to

address the issues raised by the Commission should redouble their efforts this Spring

to inspire the following collective epiphany in the members of the College

community as a whole:

a) Recognize fully the urgent necessity of resolving those issues, for the long-term

viability of the institution and the good of its students.

b) Contribute willingly to resolving those issues.

c) Come to a realistic understanding that they must take responsibility for

maintaining changes for the better on a permanent basis, and building upon them.


2) The College should immediately establish a permanent shared-governance Strategic

Planning Committee (SPC) to coordinate the strategic planning process, including

evaluation and improvement. SPC should report to CPC, and the Council should

delegate its responsibility for annual review of progress on Strategic Plan

implementation to SPC. SPC should also document, monitor, and help coordinate

improvements in the integration of lower-level planning and resource allocation

processes with the Strategic Plan; possibly it should also monitor integration of the

Educational and Facilities Master Plans with the Strategic Plan and each other, a task

now assigned to CPC. Alternatively, if creation of a new committee proves

untenable, CPC should formally assume all these responsibilities, as well as others

ordinarily undertaken by a strategic planning committee.

3) PUG, in close consultation with SPC, should accomplish the following tasks by early

Fall 2012:

a) Review and revise the Performance Indicators and associated definitions, status

data elements, and benchmarks for each Goal in light of the Goal language and

sound design and practice.

b) Fill in all gaps in the Performance Indicator matrix.

c) Develop and implement a work plan and schedule for gathering, analyzing,

interpreting, and communicating Indicator data on a regular basis.

d) Develop a methodologically sound process for periodically measuring the level of

knowledge and understanding of integrated planning and associated concepts

among members of the campus community; implement the first iteration of that

process to establish a baseline; and disseminate the findings appropriately.

4) Completing the First Full Cycle of Integrated Strategic Planning

a) SPC should accomplish the following tasks by October 1, 2012:

i) Coordinate an intensive effort to evaluate progress toward College Strategic

Goals based on the available Performance Indicators and other relevant

evidence as needed.

ii) Based on the results of the evaluation, recommend institutional improvements

to the President.

iii) Evaluate the content and approach of the Strategic Plan itself.

iv) Update the Strategic Plan based on the results of both evaluations.

v) Disseminate the updated Strategic Plan to the campus community, and make it

available also to the wider community served by the College.

b) By mid-October 2012, the President should initiate implementation of those SPC

recommendations she approves through the applicable College administrative

and/or shared-governance structures, as appropriate. Documented

implementation of as many recommendations as possible should commence

immediately, with implementation of the rest following as soon as College

processes permit.

5) By the end of Spring 2012 and annually thereafter, the Finance Committee, in

consultation with PRvC, should complete a formal, systematic evaluation of the

budget development process that is based on sound criteria and includes both

recommendations for improvements and a schedule for implementing those

improvements in the following Fall. The Finance Committee should effectively

disseminate its criteria and conclusions, and evaluate and improve its own evaluation

process for the next cycle. (See also Consultant Recommendations under College


Recommendation 2 below regarding evaluation of the program review and resource

allocation process.)

6) Communication and Dialogue

a) The President should direct every management team member responsible for the

content of any page on the SJCC website to review that page and all links that

appear on it for accuracy and currency, and to initiate corrective action as needed.

The first round of review and corrections should occur by the end of Summer

2012, and the process should be repeated at least annually.

b) If it has not already done so, the College should develop and adopt a systematic,

accessible, easily performed process for uploading minutes and other important

documents to the Infostore repository. Every shared-governance committee, task

force, or other group (or at a minimum, those whose charges relate to integrated

planning or any of the other team Recommendations) should post all its minutes,

summaries, and/or other documents in timely fashion, so that Infostore holdings

are reasonably up-to-date at all times.

c) The College should undertake a periodic assessment of communication

effectiveness on campus, at least with respect to important issues such as

institutional effectiveness, student learning improvement, integrated planning,

program review, and accreditation, and of the extent to which campus dialogue on

such issues is “ongoing, robust, and pervasive.” Based on the results of the initial

assessment, the College should make improvements in communication methods,

and schedule and document an adequate number of systematic, regular, well-designed opportunities for meaningful discussion within and among departments,

constituencies, and organizational levels on these and other important issues.

7) Resource Allocation Process

a) During the Spring 2012 resource allocation cycle, participating groups should

consider resource requests in all four categories (personnel, technology,

equipment/supplies, and facilities) in accord with the model in the Strategic Plan.

b) Every group with a significant role in the resource allocation process should

document its process fully during the Spring 2012 cycle and send a copy of the

documentation to PRvC. Based on that documentation, PRvC should develop a

description of the resource allocation process at a moderately detailed level,

focusing in particular on the linkage between program review findings and

resource allocations, and incorporate it into the San Jose City College Program

Review Information and Guidelines update scheduled for distribution in Fall

2012. If there are delays in production of that update past the beginning of

departmental work on 2012-13 program reviews, then PRvC should disseminate

the resource allocation process section separately, before the departmental work

begins.

8) If the CTC has not already completed a systematic evaluation of campus technology

needs, it should do so based in part on the technology needs described in the Fall

2012 program review submissions, and disseminate the results. Such an evaluation

should take place annually.


College Recommendation 2: Program Review

In order to meet the standards, the college needs to review and refine its program review

processes to ensure they are systematic, linked to institutional planning and resource

allocation, and are used to assess and improve student learning and achievement. Policies

should be developed that clearly delineate the authority, responsibility and procedures for

the preparation, approval and communication of the results of program reviews. (I.B,

IIA.2f, IIB.4, IIC.2, IIIA.6, IIIB.2b, IIIC.2, IIID.3, IVA.5)

IB. Improving Institutional Effectiveness: The institution demonstrates a conscious effort to produce

and support student learning, measures that learning, assesses how well learning is occurring,

and makes changes to improve student learning. The institution also organizes its key processes

and allocates its resources to effectively support student learning. The institution demonstrates

its effectiveness by providing 1) evidence of the achievement of student learning outcomes and

2) evidence of institution and program performance. The institution uses ongoing and systematic

evaluation and planning to refine its key processes and improve student learning.

IIA.2.f. The institution engages in ongoing, systematic evaluation and integrated planning to assure

currency and measure achievement of its stated student learning outcomes for courses,

certificates, programs including general and vocational education, and degrees. The institution

systematically strives to improve those outcomes and makes the results available to appropriate

constituencies.

IIB.4. The institution evaluates student support services to assure their adequacy in meeting identified

student needs. Evaluation of these services provides evidence that they contribute to the

achievement of student learning outcomes. The institution uses the results of these evaluations as

the basis for improvement.

IIC.2. The institution evaluates library and other learning support services to assure their adequacy in

meeting identified student needs. Evaluation of these services provides evidence that they

contribute to the achievement of student learning outcomes. The institution uses the results of

these evaluations as the basis for improvement.

IIIA.6. Human resource planning is integrated with institutional planning. The institution systematically

assesses the effective use of human resources and uses the results of the evaluation as the basis

for improvement.

IIIB.2.b. Physical resource planning is integrated with institutional planning. The institution

systematically assesses the effective use of physical resources and uses the results of the

evaluation as the basis for improvement.

IIIC.2. Technology planning is integrated with institutional planning. The institution systematically

assesses the effective use of technology resources and uses the results of evaluation as the basis

for improvement.

IIID.3. The institution systematically assesses the effective use of financial resources and uses the

results of the evaluation as the basis for improvement.

IVA.5. The role of leadership and the institution’s governance and decision-making structures and

processes are regularly evaluated to assure their integrity and effectiveness. The institution

widely communicates the results of these evaluations and uses them as the basis for

improvement.

Observations: Progress to Date

1) San Jose City College has been engaged in a formal program review process in some

departments since at least 2005. The process has undergone numerous changes since

then, most recently in Spring 2011 for the 2011-12 cycle. PRvC is developing

another set of revisions during Spring 2012 for implementation in 2012-13.

2) The College is now on a four-year cycle of comprehensive program reviews; shorter

annual reviews are to be conducted in the second, third, and fourth years, and, judging


from a sample of 2011-12 reports, focus primarily on justifying requests for

additional staffing. Assessment of human, physical, technology, and financial

resources, as well as assessment of outcomes, is built into the templates for both

types of review, though most submissions in that sample of annual reviews did not

actually address outcomes assessment.

3) Beginning this year, a program reportedly must have submitted its latest

comprehensive or annual review documents in order for any of its resource requests

to be considered in the resource allocation process.

4) The timeline presented in the Strategic Plan includes program review as an important

phase of integrated institutional planning.

5) Progress in Program Review

a) A tally as of July 18, 2011 showed that 24 of the 35 instructional programs (69%),

13 of the 22 student support programs (59%), and none of the six administrative

programs had completed a comprehensive program review within the last five

years.

b) Scheduled Comprehensive Program Reviews for 2011-12 through 2014-15, as of

September 1, 2011

i) Instructional Programs

All 35 instructional programs are scheduled for a comprehensive program

review over the next four years; 11 of those had no prior comprehensive

reviews over the past five years.

                          2011-12   2012-13   2013-14   2014-15
Prior PR done 2006-11        7         4         6         7
No PR done 2006-11           3         3         2         3

ii) Student Support Programs

Seventeen of the 22 student support programs are scheduled for a

comprehensive program review over the next four years; five are not, and

three of those five had no prior comprehensive reviews over the past five

years.

                          2011-12   2012-13   2013-14   2014-15
Prior PR done 2006-11        3         3         1         4
No PR done 2006-11           3         2         1         0

iii) Administrative Programs

All administrative programs except the Office of the President are scheduled

for a comprehensive program review over the next four years; the Office of

the President is not, and had no prior comprehensive review over the past five

years.

                          2011-12   2012-13   2013-14   2014-15
Prior PR done 2006-11        0         0         0         0
No PR done 2006-11           1         2         1         1

6) Program Review Coordination

a) The College established a central coordinating body for program review in Spring

2011: the Program Review Workgroup of the Accreditation Oversight Task

Force. The shared-governance PRvC succeeded it and assumed its functions in

Summer 2011. Instruction and Student Services representatives serve on the


Committee, but Administrative Services is evidently not represented. The

Committee, which meets only four times per year but has subgroups that assist

units in completion of their reviews, is supposed to be responsible for program

review coordination, quality control, integration, process evaluation, and

communication under the following charge:

i) Evaluate and provide feedback on the quality of program review documents

submitted by the reviewing units.

ii) Validate completed program review document and forward the document to

the strategic Planning Process.

iii) Monitor integration of [the] program review process with strategic planning.

iv) Provide guidance to [the] college in the use of program review materials and

the process of program review.

v) Annually evaluate the effectiveness of the program review process and

policies and procedures related to program review, and recommend

improvements and revisions as needed.

vi) Report to the College Planning Council and the Academic Senate annually

regarding Committee findings and activities.

b) PRvC subgroups have reportedly begun evaluative work on the reviews submitted

in Fall 2011, and the Committee as a whole is scheduled to complete the

document validation process by March.

c) PRvC is scheduled to complete its evaluation of the 2011-12 program review

process by the end of Spring 2012. The criteria it will use, if they have been

developed, are not known to me.

d) At the time of the Self-Study, the Student Success Committee had responsibility

for coordinating program review in Student Services. Whether it still does is

unclear, as is its functional relationship to PRvC.

7) Documentation

a) In Spring 2011, the Program Review Workgroup documented the College’s

process in San Jose City College Program Review Information and Guidelines,

2011. PRvC updated the document in August 2011. In relatively brief form, it

includes background information, Committee information (including meeting

schedules), definitions of terms, a process timeline, rubrics and a reporting form

for the evaluation of program review submissions, a form for the annual review

that takes place between quadrennial comprehensive reviews, and a schedule for

completion of program reviews through 2014-15. In mid-Fall 2012, the

Committee intends to add a section on evaluation of the program review process.

b) The College appears to have settled on a mutually agreeable definition of a

program for program review purposes—an achievement with which some other

colleges are still wrestling. In almost all cases, the same definitions are evidently

used for program SLOs (PSLOs).

c) Instructional programs use a template approved in December 2010 for their

comprehensive program reviews. It does a good job of making the case for

program review, regardless of the availability of new resources; includes the

Mission, College Strategic Goals, and ISLOs for reference; and is comprehensive

in its coverage of program aspects.

d) Student Services programs use a separate, more structured template for

comprehensive program reviews based on one used at LA Harbor College. It is


also comprehensive in its coverage of program aspects, and includes a concrete

five-year plan with goals, objectives, responsible persons, timelines, and budgets.

e) Definitive information on the template to be used in Administrative Services was

unavailable to me, though a set of additions proposed in July 2011 suggests that

the structure of the approach will be similar to that in Student Services.

8) Newsletters covering the activities of PRvC and SLOAC help keep the campus

community informed about the College’s progress in program review and outcomes

formulation and assessment.

9) Research Support

a) A standard set of data for instructional program reviews, drawn from Datatel and

reportedly now available to all programs on the Reporting Portal, includes student

demographic data; student performance data, mostly by ethnicity; productivity

figures; faculty demographics; and scheduling information over the past two

years.

b) Student Services programs evidently generate their own data in support of

program review from the Chancellor’s Office DataMart and departmental sources.

I do not know the sources of data for the scheduled Administrative Services

program reviews, the first of which has reportedly begun.

c) PUG has encouraged the use of the DataMart and CalPASS’ SmartTool in

gathering evidence for program reviews.

10) The evaluation team in 2010 found much confusion and lack of knowledge about

program review at the College. To address that issue, training opportunities on

program review, and on access to and use of program review data, have been offered

on Professional Development Days, in specially scheduled workshops, and in one-on-

one sessions with the former Interim Executive Director of RIE.

11) Program review documents are now stored in the centralized Infostore, accessible to

all College employees through any web browser.

Observations: Issues Requiring Action

1) Accreditation Requirements

a) In program review as in integrated planning, San Jose City College was supposed

to have achieved the highest level on the Commission’s Rubric for Evaluating

Institutional Effectiveness—Part I: Program Review, Sustainable Continuous

Quality Improvement (SCQI), over four years ago. The team found in November

that the College was just entering the Proficiency level, which has the following

characteristics:

• Program review processes are in place and implemented regularly.

• Results of all program reviews are integrated into institution-wide planning for improvement and informed decision-making.

• The program review framework is established and implemented.

• Dialogue about the results of all program reviews is evident throughout the institution as part of discussion of institutional effectiveness.

• Results of program review are clearly and consistently linked to institutional planning processes and resource allocation processes; college can demonstrate or provide specific examples.

• The institution evaluates the effectiveness of its program review processes in supporting and improving student achievement and student learning outcomes.

The SCQI level requires more:

• Program review processes are ongoing, systematic and used to assess and improve student learning and achievement.

• The institution reviews and refines its program review processes to improve institutional effectiveness.

• The results of program review are used to continually refine and improve program practices resulting in appropriate improvements in student achievement and learning.

2) Progress in Program Review

a) There is no centralized tool to facilitate tracking and reporting overall progress in

the program review process, so it has been difficult for the College to determine

that progress at times. The College wishes to obtain software tools for that

purpose, but efforts to begin evaluating competing systems stalled. The plan for

moving forward on the evaluation of those tools is unclear.

b) As of early December 2011, two of the 16 departments scheduled for

comprehensive program review during 2011-12 had refused to participate,

reportedly on the grounds that they had done program reviews too recently. I do

not know the extent to which the programs themselves participated in

construction of the schedule, nor how much notice PRvC or its predecessor gave

each program scheduled for a 2011-12 review.

c) The ACCJC rubric does not specify a proportion of program reviews that must be

done in order to reach the SCQI level, but given the fact that colleges are

supposed to have reached that level by 2007, it is reasonable for the Commission

to expect that by now, the proportion should be quite high in all areas of the

institution. Through 2010-11, based on the July 2011 tally above, the overall proportion of programs with a completed comprehensive review within the last five years was 59% (37 of 63). By 2014-15,

if the College adheres to the current schedule, the overall proportion of completed

comprehensive reviews within the past five years will rise to 90%. Unless the

other six programs (five in support services, along with the President’s Office)

conduct comprehensive reviews by 2014-15, the College will still fall short of full

coverage.
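
As a transparency aid, the coverage figures in this subsection can be recomputed directly from the July 2011 tally and the published schedule. The following short Python sketch is purely illustrative; it simply restates the counts reported above and prints the resulting proportions:

    # Recomputing comprehensive-review coverage from the counts reported above.
    programs = {"instructional": 35, "student_support": 22, "administrative": 6}
    reviewed_2006_11 = {"instructional": 24, "student_support": 13, "administrative": 0}
    scheduled_2011_15 = {"instructional": 35, "student_support": 17, "administrative": 5}

    total = sum(programs.values())              # 63
    current = sum(reviewed_2006_11.values())    # 37
    projected = sum(scheduled_2011_15.values()) # 57

    print(f"Coverage through 2010-11: {current}/{total} = {current / total:.0%}")
    print(f"Projected coverage by 2014-15: {projected}/{total} = {projected / total:.0%}")
    print(f"Programs still uncovered as of 2014-15: {total - projected}")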

3) Process Evaluation

a) It is not clear what evaluation process has informed past revisions of the program review process. One interviewee decried the frequency of

change for the sake of change in the program review process over the last several

years, and at the same time the failure to make other changes that have been

needed. It is both admirable and appropriate to improve processes based on a

formal, careful evaluation of their effectiveness, but it is important also to gauge

the costs of such improvements in goodwill capital, particularly if they occur

frequently. (Sometimes it even becomes necessary to place a moratorium on

changes, just to give participants a chance to catch their collective breath.) A

more systematic approach to the evaluation and revision process should result in

fewer changes and smoother implementation from year to year.


b) It is important for PRvC to develop, document, and disseminate sound criteria by

which to evaluate the process each year, if it has not already done so, and then to

evaluate its own evaluation process on an annual basis.

4) Communication and Dialogue

a) Opportunities for systematic institutional dialogue about program review results,

like those for widespread discussions of integrated planning, appear to be rare. It

is doubtful that dialogue about results can be accurately characterized as “evident

throughout the institution.” See Consultant Recommendations under College

Recommendation 1 above regarding improving communication and dialogue on

important issues.

b) The evaluation team suggested that the College add “a mechanism to share the

results of program review with broader college and district audiences,”

presumably to enhance its performance under Standards I.B.5 and II.A.2.f.

5) The effectiveness of training in improving participants’ understanding of the program

review process and of the evidence it requires has not been evaluated, to my

knowledge.

6) Documentation

a) The December 2010 template for instructional comprehensive program review

would benefit from judicious editing to eliminate redundancies; improve clarity,

precision, conciseness, and focus; and strengthen the connection between

evidence on the one hand, and conclusions and specific resource requests on the

other. It requires faculty to make projections regarding class sections, faculty,

support staff, library resources, technology, and facilities 15 years into the future,

but I have seen no sources of information on which faculty could rely to make

such extended projections. A sample of comprehensive submissions for the

current and immediately previous cycles indicates that most applicable faculty

have taken the program review process quite seriously, though programs are

uneven in their treatment of certain elements, such as SLO assessments and

interpretation of reported data. Some of the unevenness is likely attributable to

the template’s design.

b) The template for Student Services comprehensive program review would also

benefit from editing for clarity, consistency, and conciseness. The sample of

submissions I consulted was too small to support generalizations about its use.

7) Research Support

a) The integrity of program review data drawn from sources based on the

Chancellor’s Office MIS system, such as CalPASS and DataMart, ultimately rests

on the integrity of the data submitted by the College. Reportedly, many have

regarded those data as suspect, and only recently has systematic review of MIS

data, along with IPEDS submissions, begun in earnest.

b) The standard set of data provided for instructional program review does not

include all the data elements requested in the December 2010 instructional

program review template, nor does it cover the requested five years. Program

review participants typically cut and paste data tables into their document, if the

data they wish to include is available at all, because the template does not permit

automated integration of the data at present.

8) The evaluation team called for development of policies “that clearly delineate the

authority, responsibility and procedures for the preparation, approval and

communication of the results of program reviews.” The Guidelines and other


documents set forth procedural requirements, but allocation of authority and

responsibility for program review preparation, coordination, and communication

across all institutional areas is still not entirely clear, at least to me. Nor do policies

exist that would meet this part of the Recommendation.

Consultant Recommendations

1) Progress in Program Review

a) In consultation with the affected programs and managers, the PRvC should

immediately add to the existing comprehensive review schedule the six programs

that otherwise will not have completed a comprehensive program review within the five years preceding 2014-15.

b) PRvC should reschedule each of the two programs that refused to prepare a

comprehensive program review in 2011-12, with their input, for slots no more

than five years after their last comprehensive program reviews. If these two programs' refusal indicates that programs in general were given no opportunity to provide input into the schedule, PRvC should now give all programs an

opportunity to review the schedule and request changes within certain explicit

limits and constraints (e.g., the need to adhere to the five-year cycle, the need for

balance across the five years), with the proviso that failure to respond by the

deadline constitutes acceptance of the schedule as is. PRvC should document the

input process and disseminate its results. In any case, PRvC should ensure that all

programs are well aware of their program review deadlines.

2) By the end of Spring 2012 and annually thereafter, PRvC, in consultation with the

Finance Committee, should complete a formal, systematic evaluation of the program

review and resource allocation process that is based on sound criteria and includes

both recommendations for improvements and a schedule for implementing those

improvements in the following Fall. PRvC should effectively disseminate its

conclusions and the criteria on which they are based; evaluate and improve its own

evaluation process (including the criteria) for the next cycle; and document the

improved process in the Fall 2012 update of the Guidelines. (See also Consultant

Recommendations under College Recommendation 1 above regarding evaluation of

the budget development process.)

3) In consultation with IT staff, PRvC and the Student Learning Outcomes Assessment

Committee (SLOAC) should jointly coordinate the evaluation of software tools for

tracking and reporting both outcomes and program reviews, with selection of the

most appropriate system by June 2012, and testing and implementation by mid-Fall

2012.
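
Pending selection and implementation of such a system, even a very lightweight interim tool can supply the status reporting the College currently lacks. The sketch below is a hypothetical illustration only (the file name, column names, and status values are assumptions, not an existing College system); PRvC could maintain a single shared spreadsheet and summarize it with a few lines of Python:

    # Hypothetical interim tracker: summarizes program review status by area
    # from a shared CSV with columns: program, area, scheduled_year, status
    # (status values assumed here: not started, in progress, submitted, validated).
    import csv
    from collections import Counter

    def summarize(path):
        totals, validated = Counter(), Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[row["area"]] += 1
                if row["status"].strip().lower() == "validated":
                    validated[row["area"]] += 1
        for area in sorted(totals):
            done, n = validated[area], totals[area]
            print(f"{area}: {done}/{n} validated ({done / n:.0%})")

    # Example usage (file name is hypothetical):
    # summarize("program_review_status.csv")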

4) To “share the results of program review with broader college and district audiences,”

the PRvC should consider preparing a thematic summary of program review results

each year, and making it widely available on the SJCC website.

5) PRvC should prepare a brief evaluation form to be completed by participants in all

training sessions, compile and analyze the results at least annually, and incorporate

the findings in its evaluation of the program review and resource allocation process.
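
A compilation of this kind need not be elaborate. Assuming the form includes a simple 1-5 overall rating item (an assumption for illustration, not a prescribed instrument), the annual analysis could be as modest as the following sketch:

    # Illustrative summary of training-session evaluation ratings (1-5 scale assumed).
    from statistics import mean

    responses = {  # sample data for illustration only
        "Program review data workshop": [5, 4, 4, 3, 5],
        "Professional Development Day session": [4, 4, 5, 2, 4, 3],
    }

    for session, ratings in responses.items():
        favorable = sum(r >= 4 for r in ratings) / len(ratings)
        print(f"{session}: n={len(ratings)}, mean={mean(ratings):.1f}, rated 4-5: {favorable:.0%}")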

6) As part of its revision of the program review process and documentation during

Spring 2012, PRvC should edit all templates and forms for clarity, precision,

consistency, conciseness, and focus, to improve the quality and consistency of

submissions. It should employ the new templates and forms beginning in Fall 2012.


7) Research Support

a) By whatever means necessary, the College or District should provide clear

documentation of and easy access to standard reports of reliable data that fulfill at

least the minimum quantitative evidence requirements of the instructional

program review process.

b) To the extent that noninstructional program reviews as a matter of course rely on

common sets of data elements other than those used for instructional program

reviews, the College or District should provide clear documentation of and easy

access to standard reports of those data, too.

c) Especially if the College continues to use CalPASS' SmartTool, DataMart, or other reports that rely on Chancellor's Office MIS data for program review, it must establish a systematic process for monitoring the integrity of every Chancellor's Office MIS submission and correcting the data to within acceptable standards before final submission. (A minimal illustration of such a pre-submission check follows item g below.)

d) The College or District should develop a process for reporting, investigating, and

resolving any significant concern that arises about the integrity of other

centralized data sources used in program review.

e) PRvC, in consultation with PUG, should ensure that adequate training in

accessing, using, and interpreting data for program review is available in timely

fashion for all participants.

f) PRvC, in consultation with PUG and District information technology staff, should

explore the possibility of creating electronic program review templates that more

fully integrate standard data reports and planning and resource allocation forms.

(See, for example, the system used by Crafton Hills College.)

g) The College, in consultation with District staff and its sister institution, should

investigate available options in user-friendly query programs suitable for use with

the District’s data systems, with the long-term goal of providing program review

participants with useful tools for designing ad hoc reports that will enrich their

understanding of their programs and the institution’s understanding of itself.
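
With respect to item c above, a systematic integrity check need not wait for specialized software. The sketch below illustrates the kind of pre-submission screening intended; the field names and plausibility range are assumptions for illustration and would need to be replaced with the actual MIS element definitions:

    # Illustrative pre-submission screen for an MIS extract. Field names and the
    # plausibility range are assumptions, not actual MIS element definitions.
    import csv

    def check_extract(path):
        problems, seen = [], set()
        with open(path, newline="") as f:
            for line_no, row in enumerate(csv.DictReader(f), start=2):
                sid = row.get("student_id", "").strip()
                if not sid:
                    problems.append(f"line {line_no}: missing student_id")
                elif sid in seen:
                    problems.append(f"line {line_no}: duplicate student_id {sid}")
                seen.add(sid)
                units = row.get("units_attempted", "")
                try:
                    if not 0 <= float(units) <= 30:  # assumed plausible term load
                        problems.append(f"line {line_no}: units_attempted out of range: {units}")
                except ValueError:
                    problems.append(f"line {line_no}: units_attempted not numeric: {units!r}")
        return problems

    # Example usage (file name is hypothetical):
    # for issue in check_extract("mis_term_extract.csv"):
    #     print(issue)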

8) PRvC should document allocation of authority and responsibility for program review

preparation, coordination, and communication across all institutional areas, if such

documentation does not already exist, and incorporate it into the next revision of the

Guidelines. CPC should consider whether a policy is needed to address these matters,

and make a recommendation accordingly to the President.


College Recommendation 3: Student Learning Outcomes

The team recommends that the college accelerate its efforts at identifying, assessing, and

communicating Student Learning Outcomes in order to meet the requirement that all

colleges will be at the proficiency level as identified by ACCJC by 2012. (II.A.1.c, II.A.2.b, II.A.6, II.B.4, II.C.2, III.A.1.c)

Commission Reminder

The Commission expects that institutions meet standards that require the identification

and assessment of student learning outcomes, and the use of assessment data to plan and

implement improvements to educational quality, by fall 2012. The Commission reminds

San Jose City College that it must be prepared to demonstrate that it meets these

standards by fall 2012 (Standards I.B.1, II.A.2.e, II.A.2.f, II.B.4, II.C.2, II.A.1.c and

III.A.1.c).

IB.1. The institution maintains an ongoing, collegial, self-reflective dialogue about the continuous

improvement of student learning and institutional processes.

IIA.1.c. The institution identifies student learning outcomes for courses, programs, certificates, and

degrees; assesses student achievement of those outcomes; and uses assessment results to make

improvements.

IIA.2.b. The institution relies on faculty expertise and the assistance of advisory committees when

appropriate to identify competency levels and measurable student learning outcomes for courses,

certificates, programs including general and vocational education, and degrees. The institution

regularly assesses student progress towards achieving those outcomes.

IIA.2.e. The institution evaluates all courses and programs through an on-going systematic review of

their relevance, appropriateness, achievement of learning outcomes, currency, and future needs

and plans.

IIA.2.f. The institution engages in ongoing, systematic evaluation and integrated planning to assure

currency and measure achievement of its stated student learning outcomes for courses,

certificates, programs including general and vocational education, and degrees. The institution

systematically strives to improve those outcomes and makes the results available to appropriate

constituencies.

IIA.6. The institution assures that students and prospective students receive clear and accurate

information about educational courses and programs and transfer policies. The institution

describes its degrees and certificates in terms of their purpose, content, course requirements, and

expected student learning outcomes. In every class section students receive a course syllabus that

specifies learning objectives consistent with those in the institution’s officially approved course

outline.

IIB.4. The institution evaluates student support services to assure their adequacy in meeting identified

student needs. Evaluation of these services provides evidence that they contribute to the

achievement of student learning outcomes. The institution uses the results of these evaluations as

the basis for improvement.

IIC.2. The institution evaluates library and other learning support services to assure their adequacy in

meeting identified student needs. Evaluation of these services provides evidence that they

contribute to the achievement of student learning outcomes. The institution uses the results of

these evaluations as the basis for improvement.

IIIA.1.c. Faculty and others directly responsible for student progress toward achieving stated student

learning outcomes have, as a component of their evaluation, effectiveness in producing those

learning outcomes.


Observations: Progress to Date

1) Course SLOs

a) Since 2002, outlines of all new and revised courses have been required to include

course Student Learning Outcomes (SLOs), according to the Self-Study. As of

October 2011, reportedly 63% of courses have defined SLOs, which are listed in

the course outline of record.

b) A review of nearly 100 course outlines of record indicates that the vast majority

of course SLO statements are reasonably well-constructed. However, many

courses list a relatively large number of SLOs in excessive detail, so that the set of

SLOs focuses more on a collection of discrete skills and knowledge than on

higher-level application of those skills and knowledge as the culmination of

learning in the course. In those cases, they resemble objectives more than

outcomes. Recent training has emphasized the characteristics of model outcomes

as distinct from objectives, so new course SLO sets are likely to reflect best

practices, and refinement of existing course SLO sets through the six-year

curriculum review cycle will no doubt move them in the same direction.

2) Program SLOs

a) A sampling of instructional PSLOs indicates that most sets are both well-

constructed and appropriate in scope.

b) A review of student services and support PSLOs indicates that these programs have focused heavily on student learning outcomes per se and, with one exception (Library),

have not formulated service area outcomes as a basis for gauging their

effectiveness. Student Services programs are reportedly receptive to developing

service area outcomes. Two Student Services programs and one support program

included assessment approaches with their PSLO statements.

3) The College developed six ISLOs (referred to in the Standards as degree SLOs)

several years ago. They are listed in the 2011-12 Catalog. One ISLO has been

assessed so far.

4) Coverage of outcomes is explicit in all program review templates, which means that

the College has already achieved, at least on paper, one component of the SCQI level

on the applicable Commission rubric.

5) SLO Process Coordination

a) SLOAC was established by the Academic Senate on May 18, 2010. It first met in

September 2010, and was charged with addressing the deficiencies in the

College’s SLO process and developing best practices for that process. Its

approach is to mentor faculty in their work on SLOs, and explicitly not to

evaluate that work for quality control purposes. It also stresses the importance of

dialogue and collaboration. For 2011-12, the Committee is composed of nine

faculty members and one dean, who serves ex officio.

b) In 2010-11, the SLO Coordinator, a .60-FTEF reassigned-time position,

reportedly provided support for the SLO identification and assessment cycle, but

that function was assumed by SLOAC in Fall 2011. The Coordinator position

was flown (advertised) again that semester, and will be filled beginning on January 30, 2012,

with two faculty members splitting the reassigned time evenly.

c) The Academic Senate and Faculty Association reportedly agree that SLO

assessment is a professional responsibility of faculty.


6) Documentation and Training

a) SLOAC has created a rough draft of an SJCC Student Learning Outcomes and

Assessment Handbook. It describes ACCJC requirements, the role of SLOAC, the

goals of SLO assessment, and the nature of the SLO cycle; offers some definitions

and methodological guidance for assessment; and provides timelines and

reporting forms. It is focused on instructional SLOs per se at the course and

program levels, and adopts a mostly nonprescriptive tone.

b) Documented training opportunities on SLOs and assessment have been offered

since at least the January 2011 Professional Development Day.

7) The Academic Senate has reportedly made a commitment to push for completing the

formulation of all course and program SLOs by the end of Spring 2012. As part of

that effort, instructional deans are collecting information on the status of SLOs and

assessment in their divisions. Results are not yet available.

8) Assessment

a) The Academic Senate approved assessment forms for both course and program

SLOs in May 2011. The forms are concise, clear, and appropriately constructed.

They contain a section for reporting recommended improvements or SLO

changes, but not a section for reporting improvements or changes actually

implemented.

b) Completed SLO assessment forms are submitted to the applicable dean, who

reportedly records them in an Excel spreadsheet kept in the division office. All

instructional deans reportedly use a common format for their spreadsheets.

9) The evaluation team found that “outcome mapping of course, program and

institutional SLOs remains to be done.” However, a review of SLO and evidence

files in Infostore and the links on the SJCC PSLO website indicated the following

about SLO mapping across course, program, and institutional levels:

a) PSLOs have been mapped to ISLOs for the following programs, all but one of

which are in the Business and Technology division:

i) Accounting

ii) Business and Marketing

iii) Air Conditioning

iv) Communication Studies

v) Computer Applications/CIS

vi) Construction

vii) Cosmetology

viii) Dental Assisting

ix) FMT

x) Laser Technology

xi) Machine Technology

xii) Real Estate

b) SLOs in all Biology courses are mapped to Biology PSLOs and separately to

ISLOs.

c) SLOs in all Math courses are mapped to Math PSLOs and separately to ISLOs.

d) Bus 82 course SLOs are mapped to Business PSLOs, which in turn are mapped to

ISLOs. I found no other Bus courses with such mapping, but the fact that it has

been done for one course might imply that it has been done for others.

e) I found no current mapping of student services or support PSLOs to ISLOs,

although the Student Services program review template does ask each program to


describe how the ISLOs are integrated into the unit’s work. The Self-Study

indicated that Student Services had developed three common divisional SLOs that

were based on the ISLOs in 2008-09, but I found little indication that any

departments used those SLOs as the basis of their PSLOs.

f) In summary, among 70 instructional, student services, and support programs, 14

(20%) have documented at least some cross-level SLO mapping, most of which is

PSLO to ISLO; the rest either show only PSLOs without maps to ISLOs (59%) or

do not have an active and correct link on the PSLOs website. No information

about PSLOs or service area outcomes in Administrative Services was available

to me.

g) There are scattered references to mapping that has taken place in other programs,

but I have not seen any documentation of such mapping in the sources available to

me.
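
The cross-level mapping described above is, in essence, a small many-to-many relationship, and the record-keeping burden can be kept low while the College evaluates tracking software. The sketch below shows one hypothetical way to record PSLO-to-ISLO maps and flag programs with no documented map; the program and outcome identifiers are placeholders, not the College's actual outcomes:

    # Hypothetical record of PSLO-to-ISLO mapping; identifiers are placeholders.
    # An empty list marks a PSLO designated as program-specific; an empty dict
    # marks a program with no documented map yet.
    pslo_to_islo = {
        "Program A": {"PSLO1": ["ISLO2"], "PSLO2": ["ISLO1", "ISLO4"]},
        "Program B": {"PSLO1": ["ISLO3"], "PSLO2": []},
        "Program C": {},
    }

    def mapping_status(maps):
        mapped = [p for p, m in maps.items() if m]
        print(f"{len(mapped)} of {len(maps)} programs have a documented PSLO-to-ISLO map")
        for program in sorted(maps):
            if not maps[program]:
                print(f"  missing map: {program}")

    mapping_status(pslo_to_islo)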

Observations: Issues Requiring Action

1) Accreditation Requirements

a) To resolve the Recommendation, the College must demonstrate that it has reached

the Proficiency level on the Commission’s Rubric for Evaluating Institutional

Effectiveness—Part III: Student Learning Outcomes by Fall 2012. The

Proficiency level has the following characteristics:

• Student learning outcomes and authentic assessment are in place for courses, programs and degrees.

• Results of assessment are being used for improvement and further alignment of institution wide practices.

• There is widespread institutional dialogue about the results.

• Decision-making includes dialogue on the results of assessment and is purposefully directed toward improving student learning.

• Appropriate resources continue to be allocated and fine-tuned.

• Comprehensive assessment reports exist and are completed on a regular basis.

• Course student learning outcomes are aligned with degree student learning outcomes.

• Students demonstrate awareness of goals and purposes of courses and programs in which they are enrolled.

b) The evaluation team prescribed the following steps to reach that goal, which, in

my judgment, represent a more charitable interpretation of the Commission’s

requirements than the College should risk:

i) Complete identification of course SLOs,

ii) Identify a significant portion of program SLOs,

iii) Map the course outcomes to program and institutional outcomes, [and]

iv) Augment the pace of assessment.

c) In working to reach Proficiency, the College should bear in mind that it must also

work toward the highest level on the Rubric, Sustainable Continuous Quality

Improvement (SCQI), which has the following characteristics:

• Student learning outcomes and assessment are ongoing, systematic and used for continuous quality improvement.

• Dialogue about student learning is ongoing, pervasive and robust.

• Evaluation and fine-tuning of organizational structures to support student learning is ongoing.

• Student learning improvement is a visible priority in all practices and structures across the college.

• Learning outcomes are specifically linked to program reviews.

d) The team also noted under College Recommendation 5 that the College will fail

to meet Standard III.A.1.c unless “engagement in improving student learning as a

result of [SLO] assessments [becomes] a component of evaluation” of “faculty

and others directly responsible for student progress toward achieving stated

student learning outcomes” by 2012, when SLO assessments are to be

systematically implemented. Of course, this requirement, which has caused

consternation in many districts, is subject to negotiation, and therefore progress

depends on action at the District level.

2) Progress in the SLO Cycle

a) As of the Spring 2011 ACCJC Annual Report, according to largely self-reported

departmental figures:

i) 60% of all courses had defined SLOs (later updated in the Follow-Up Report

to 63%).

ii) Only 27% of courses had ongoing assessment of SLOs.

iii) Only 27% of instructional programs had defined SLOs.

iv) None of the instructional programs had ongoing assessment of SLOs.

v) Only 25% of student learning and support activities had defined SLOs.

vi) Only 19% of student learning and support activities had ongoing assessment

of SLOs.

vii) Ongoing assessment applied to only 17% of ISLOs.

b) General Education SLOs (GESLOs) have been neither identified nor assessed.

c) There is no centralized repository or tools for recording, monitoring, and

reporting the status of the SLO cycle at the College, so it is very difficult to

determine overall progress toward Proficiency. Each instructional dean, through

the administrative assistant, reportedly maintains his or her own Excel

spreadsheets on SLO assessments, and the disposition of the SLO assessment

reports themselves is unknown to me. It is unclear whether noninstructional

administrators employ any tracking tools. The College wishes to obtain software

tools for this purpose, but efforts to begin evaluating competing systems stalled.

The plan for moving forward on the evaluation of those tools is unclear. (See

Consultant Recommendations under College Recommendation 2 above.)

d) The College’s progress in completing the SLO cycle of assessment and

improvement is unclear, but likely minimal. I have found no documentation of

the implementation of improvements or reformulation of SLOs based on SLO

assessments, much less reevaluation of progress after implementation of

improvements.

e) The only schedule I have found for completing the SLO cycle of assessment and

improvement is the SLOAC statement that all SLOs should be assessed at least

every six years. But for this year, SLOAC is reportedly aiming for the completion

of a single course SLO in each course, and a single PSLO in each program, by the

end of Spring 2012. At that pace, given the sheer number of course and program

SLOs that exist, it seems unlikely that the College will complete the cycle for all


course and program SLOs for many years to come, without a concerted effort to

accelerate the process.

3) Dialogue

a) SLOAC’s Handbook describes a tension between evidence and accreditation

requirements on the one hand, and dialogue and collaboration among faculty on

the other; it also explicitly privileges the latter over the former. In my judgment,

it is entirely correct to stress the fact that assessment of outcomes and

improvement of student learning represent sound practice, and the right thing to

do for the long-term benefit of the institution and its students, independent of

what the Commission requires. It is also crucial for participants in outcomes

assessment to discuss, interpret, and come to conclusions about them and about

how to improve student learning. But the concepts of assessment and

improvement presuppose dimensions along which learning may be measured, and

in turn the gathering of evidence for that measurement, whether qualitative or

quantitative. Dialogue without sound, relevant evidence of some kind, certainly

from the accreditation perspective, is extremely unlikely to improve student

learning.

b) Despite SLOAC’s emphasis on dialogue, opportunities for systematic dialogue on

outcomes assessment are rare. Institutional dialogue is arguably not

“widespread,” much less “ongoing, pervasive and robust.” Moreover, it is unclear

whether many faculty, staff, and managers would take advantage of them if they

were offered. Several interviewees commented that most people engage in

outcomes assessment, if at all, only because it is required, not because they

recognize its long-term value to the institution and its students. On the other

hand, one interviewee asserted that if the College can get faculty in a room talking

about improving education, student learning, and student success, good things

will happen.

4) In the absence of a set schedule, mapping of SLOs across levels has been slow and

unsystematic, despite repeated encouragement from SLOAC and its members.

5) Ensuring Quality

a) SLOAC supplies information and training on SLO formulation and assessment,

but explicitly disavows any role in evaluation or quality control. In fact, I have

found no documentation of any monitoring, evaluation, or quality control

mechanism to ensure that SLOs are appropriately formulated and assessed, and no

documentation that the process results in implementation of changes to improve

student learning. SLOAC has not yet engaged in any evaluation of the SLO

formulation and assessment process, nor of its own effectiveness in facilitating

that process.

b) A handbook or set of guidelines for formulation and assessment of service area

outcomes, or learning outcomes outside the context of a course, does not appear to

exist. Yet effective outcomes formulation and assessment in Student Services and

Administrative Services is sound practice in those areas, and is required by

ACCJC Standards.

6) Dissemination

a) There has been no overall assessment of the extent to which “students

demonstrate awareness of goals and purposes of courses and programs in which

they are enrolled.”


b) Standard II.A.6 requires that students in every section “receive a course syllabus

that specifies learning objectives consistent with those in the institution’s

officially approved course outline,” but according to the Self-Study, only 40% of

a sample of syllabi met that Standard. It is unclear whether the consistency of

syllabi with course outlines of record has improved.

c) With rare exceptions, PSLOs do not appear in the 2011-12 online Catalog. The

College’s SLO webpage is not accessible through the standard website menus—

one must either do a search for it, or know the URL—so it is more difficult than it

should be to find information about SLOs and the SLO process.

Consultant Recommendations

1) Completion of Course and Program SLO Identification and Mapping: Two Options

a) Kickstart Option

i) The College should devote the next available Professional Development Day

entirely, or almost entirely, to the subject of outcomes: identification,

assessment, mapping, and completing the cycle. After all, what subjects are

more important for professional development than student learning and

institutional effectiveness? Faculty who have still not identified and mapped

the SLOs for the courses they teach, and managers, faculty, and staff in both

instructional and noninstructional programs for which PSLOs or service area

outcomes have not yet been identified and mapped—given appropriate

advance organizers and on-site support from SLOAC members and others—

could complete a substantial proportion of the task, if not all of it, in one day.

Those in need of methodological assistance in setting up or executing

assessment, in analyzing and interpreting results, or in recommending

improvements, could get it, and might complete some of those tasks. Faculty,

managers, and staff who already have their SLO process well in hand could

act as consultants for those who do not, or could engage in spirited discussions

about learning and effectiveness and how to facilitate it. These discussions

could plant the seeds of a lasting, institution-wide dialogue on these important

subjects. The SLO Co-Coordinators should ensure that a copy of each set of

newly formulated or mapped course or program SLOs is stored in a single

centralized electronic repository for reference, pending selection of a

systematic tool for tracking and reporting on the SLO cycle.

b) September Option

i) With the support of the President, the Vice President for Academic Affairs

(VPAA), and the Academic Senate, and without delay, the SLO Co-

Coordinators and SLOAC should create and coordinate the execution of a

plan for applicable instructional faculty to complete the identification of sound

SLOs in all remaining active courses and programs offered at the College no

later than September 30, 2012. As part of execution of that plan:

(1) SLOAC and the SLO Co-Coordinators, with the cooperation of

departmental faculty and division deans, should:

(a) Identify all the courses and instructional programs for which SLOs

remain to be developed, and provide that information to the VPAA,

deans, and applicable faculty.


(b) Provide appropriate information, models, feedback, and other support

to applicable faculty in the development of the SLOs.

(2) Faculty should map each SLO to applicable SLOs at other levels, or

designate it as course- or program-specific, as they go.

(3) The deans should take an active role in ensuring that the applicable faculty

receive appropriate support in these efforts and that they complete the task

in timely fashion.

(4) The VPAA, with the assistance of IT and other staff as needed, should

ensure that at a minimum, one copy of each completed and mapped set of

new course and program SLOs and one copy of each set of mapped course

and program SLOs that already exists are stored in a single centralized

electronic repository for reference, pending selection of a systematic tool

for tracking and reporting on the SLO cycle.

ii) With the support of the President, the Vice President for Student Affairs

(VPSA) and VPAA, in consultation with applicable managers, faculty, and

staff and with the Academic Senate as needed, should create and lead the

execution of a plan for completing the identification of PSLOs in all

remaining student and learning support programs no later than September 30,

2012. Participants should map each PSLO to the applicable ISLO(s), or

designate it as program-specific, as they go. A copy of those mapped PSLOs

should be added to the repository.

iii) With the support of the President, the Vice President of Administrative Services (VPAS), in consultation with applicable

managers and staff, should create and lead the execution of a plan for

completing the identification of PSLOs in all Administrative Services

programs no later than September 30, 2012. Participants should map each

PSLO to the applicable ISLO(s), or designate it as program-specific, as they

go. A copy of those mapped PSLOs should be added to the repository.

2) SLOAC should develop and recommend to the Academic Senate and the President an

appropriate structure and process for the formulation and mapping of GESLOs, to be

completed no later than May 31, 2012, and for the systematic, scheduled assessment

of the GESLOs, to begin in earnest no later than September 30, 2012. Preliminary

work on GESLOs could be completed at the next available Professional Development

Day, if the Kickstart Option above is chosen.

3) Accelerating SLO Assessment and Completing the Cycle

a) Once firm plans are in place for completing the identification and mapping of all

course, program, and General Education SLOs, the SLO Co-Coordinators, with

the assistance of SLOAC, should focus a substantial part of their efforts during

Spring 2012 on accelerating SLO assessment and completing the cycle.

b) The College, through the appropriate governance structures and processes, should

set a specific, reasonable deadline for completing the assessment of each existing

course SLO and each PSLO, document and disseminate the schedule, and monitor

the work to ensure that each assessment occurs in timely fashion. As new SLOs

are added to the repository, a deadline should be set for each.

c) The SLO Co-Coordinators and SLOAC, perhaps through a train-the-trainer

approach or other method to leverage members’ expertise, should provide all

information and other support needed for participants to adhere to the schedule. If

necessary, the College should augment the membership of SLOAC.


d) SLOAC should add a section to the course and program SLO Assessment

Reports, or develop another reporting mechanism, to permit reporting of actual

completion of the SLO cycle through implementation of improvements.

e) SLOAC should adopt and disseminate a firm schedule for completing the cycle on

each remaining ISLO, and then adhere to it.

4) SLOAC should complete a full draft of the SJCC Student Learning Outcomes and

Assessment Handbook in time to circulate it for feedback in Spring 2012, and then

complete and distribute the final draft by the beginning of the Fall 2012 semester.

The Committee, in consultation with noninstructional managers, faculty, and staff,

should include explicit coverage of noninstructional areas and service area outcomes

and assessment, and should adopt a more authoritative tone in its recommendations of

best practices.

5) Even if the College does not choose the Kickstart Option above, it should

systematically schedule engaging opportunities for institutional dialogue about

student learning that is “ongoing, pervasive, and robust.”

6) Evaluation and Improvement

a) SLOAC should evaluate its own effectiveness and that of the SLO process by the

end of Spring 2012, and implement improvements by Fall 2012.

b) The College should conduct a thorough evaluation of its course and program

SLOs based on accepted standards in the field. If the evaluation demonstrates that

the SLOs overall are of acceptable or higher quality, then the College can

conclude that SLOAC’s nonevaluative approach to guiding the SLO process has

been effective. If, on the other hand, it demonstrates that too many SLOs are

inadequate or inappropriate, then the College should establish a mechanism for

the quality control of the SLO process.

7) Dissemination

a) PUG should explore options for the assessment of student awareness of course

and program goals and purposes, and make a recommendation to College

leadership on how best to gauge that awareness.

b) If the College has not already done so, it should establish a systematic process for

ensuring the inclusion of SLOs in all course syllabi consistent with those listed in

the corresponding course outlines of record.

c) The College should consider including PSLOs in the Catalog’s curriculum

descriptions.

d) The College should make the SLO webpage accessible through the standard

website menu structure.

8) The President should raise the issue of meeting Standard III.A.1.c (the relationship of

the outcomes process to personnel evaluations) with the Chancellor, and request that

negotiations on this issue begin as soon as practicable.

