Evaluation of the Academy for College Excellence: Report on Academic Outcomes (Spring 2010–Spring 2011)

MPR Associates, Inc. 2150 Shattuck Avenue, Suite 800

Berkeley, CA 94704

Contact Beverly Farr, Ph.D.

[email protected] 510-849-4942

August 2012

Prepared under contract to the Academy for College Excellence,

Cabrillo College through funding by the Bill and Melinda Gates Foundation


Evaluation of the Academy for College Excellence: Report on Academic Outcomes (Spring 2010–Spring 2011)

Prepared by

MPR Associates, Inc.
2150 Shattuck Avenue, Suite 800
Berkeley, CA 94704
510-849-4942

Beverly Farr, Ph.D.
David Radwin, M.A.
Susan Rotermund, Ph.D.

August 2012


Contents

Executive Summary
    Key Findings
        1. ACE participants who took English were more likely than their nonparticipant counterparts to complete college-level English.
        2. On average, ACE participants earned more college credits than nonparticipants, although not more transferable credits.
        3. Overall, ACE participants were more likely than nonparticipants to enroll full time in the semester following the ACE semester, but when part-time enrollment is included there is no difference overall.
Introduction
    ACE Participants
    Implementation of the ACE Model
    Purpose of the Study
    Research questions related to academic outcomes
    Conceptual framework for analyses
        Cohorts
        Data
        Background characteristics
        Limitations
        Outcome data
    Methods
        Propensity score matching
Findings
    Completion of college-level English
    Credit accrual
    Full-time enrollment after ACE semester
    Persistence
    Summary of findings
References


Executive Summary

The goal of the Academy for College Excellence (ACE) program is to develop a national model for recruitment, preparation, retention, and acceleration of underprepared community college students. Centered on the belief that underprepared students, especially disadvantaged young adults, often enter community colleges with the desire to better their lives but without the academic qualifications, professional skills, and personal behaviors necessary to succeed, ACE has intentionally focused on students with multiple challenges related to poverty and discrimination.

The ACE program is one semester long and consists of a) a two-week intensive Foundation Course that focuses on personal development and prepares students to be successful in college and b) a 12–16 week Bridge Semester of accelerated academic courses, including a project-based research course. The program is unique in that students move through both portions of the program as a cohort, with a program design that consciously creates and develops, through curriculum in the classroom, a peer-support network that facilitates their persistence and success in the program. ACE serves a varied student population. Almost all students are identified by ACE as either “at risk” or “high risk” but vary on other characteristics such as age, race, and gender.

The longitudinal evaluation of the ACE program (spring 2010–spring 2013) involves an analysis of academic outcomes of ACE students compared to a group chosen through propensity score matching, an analysis of psychosocial factors exhibited by ACE students, and a synthesis of qualitative data collected through site visits and open-ended survey items. This report expands on the results of the analyses of psychosocial factors and other qualitative data presented in a December 2011 report. It examines five important milestones and momentum points for 773 students participating in ACE at Cabrillo College, Hartnell College, and Los Medanos College in the spring 2010, fall 2010, and spring 2011 semesters. It includes an analysis of key academic outcomes (e.g., credit accrual, retention, persistence, full-time enrollment, and successful completion of gatekeeper English courses) for students participating in the ACE program. This report is the second in a series of reports that will be developed for this study. The remaining three reports—to be produced in October 2012 as well as spring and fall 2013—will combine qualitative and quantitative outcomes.


Key Findings

Outcomes for ACE participants are compared with the outcomes of a similar group of students sampled from student records at each college using propensity score matching. Results are disaggregated by college, program type, and semester.

1. ACE participants who took English were more likely than their nonparticipant counterparts to complete college-level English.

Overall, 57 percent of ACE participants who took English completed college-level English in the ACE semester, compared with 19 percent of non-participants, and this difference persists for the following two semesters. This result holds true for every college and program type, with two exceptions. First, participants in the Hartnell agricultural program type were no more likely to complete college-level English in the ACE semester and were less likely to do so by the end of the first semester, but this result is based on a very small comparison of ten participants and ten non-participants. The other exception was the Cabrillo College ACE participants who enrolled in non-accelerated English in the ACE semester, who were less likely to complete college-level English in the ACE semester or subsequent semesters than the matched comparison group. As described below, due to local policy constraints, some Cabrillo College ACE participants in spring 2010 and fall 2010 took non-accelerated English, but in spring 2011 all ACE participants were required to take accelerated English.

2. On average, ACE participants earned more college credits than nonparticipants, although not more transferable credits.

On the whole, ACE participants earned 7 to 8 more college credits on average than non-participants, and this difference held for virtually every college, program type, and semester.

ACE participants earned 1 to 4 fewer credits that are transferable to in-state public universities than non-participants, though neither group earned very many transferable credits. Again, this pattern applied across groups and time periods with few exceptions.

3. Overall, ACE participants were more likely than nonparticipants to enroll full time in the semester following the ACE semester, but when part-time enrollment is included there is no difference overall.

In total, 46 percent of ACE participants and 38 percent of non-participants enrolled full time in the first semester after the ACE semester, but the results varied considerably by college and program type. Los Medanos College ACE participants were 33 percentage points more likely to enroll full time in the next semester, whereas Hartnell College ACE participants were 5 to 15 percentage points less likely to enroll full time.

When part-time enrollment is included, there was essentially no difference overall between ACE participants and non-participants in the percentage who enrolled in the semester after the ACE semester. But this finding masks the marked variation in persistence rates across ACE participants in particular colleges and program types, from a low of 30 percent in Hartnell College’s fall 2010 green building cohort to a high of 92 percent in Los Medanos College’s fall 2010 cohort. The differences between ACE participants and non-participants were more modest, with ACE participants being anywhere from 21 percentage points more likely to persist to 30 percentage points less likely to persist.

These results suggest several ways in which the ACE program is effective in guiding at-risk community college students toward ultimate success. Strong positive outcomes on several indicators provide evidence for early momentum associated with completion and transfer. At the same time, ACE participants earn slightly fewer transferable credits, on average, than non-participants, but neither group earns very many, considering that students typically need at least 60 credits to transfer to an in-state public university.

Future reports will examine additional groups of ACE participants and will track these groups for additional semesters.


Introduction

The goal of the ACE program is to develop a national model for recruitment, preparation, retention, and acceleration of underprepared community college students. Centered on the belief that underprepared students, especially disadvantaged young adults, often enter community colleges with the desire to better their lives but without the academic qualifications, professional skills, and personal behaviors necessary to succeed, ACE has intentionally focused on students with multiple challenges related to poverty and discrimination.1 For all participating students, the goal is to develop professional career skills and the ability to navigate the professional work culture, including the organizational and study skills, motivation and self-confidence, and academic skills needed for college success.

A primary objective of ACE is to accelerate student progress by providing a program that conveys a vision of academic life that often differs from that which is commonly held by disadvantaged students and an understanding of what it will take to succeed. The approach of the program integrates team management strategies, movement classes, primary research tasks, and academic and computer courses. ACE has been successful at accelerating student progress because of its unique features. It is an intensive, full-time program that immerses students in a new vision of what academic life entails and how they can succeed in higher education and professional careers.

The ACE program is one semester long and is divided into a two-week intensive Foundation Course that focuses on the development of professional skills and prepares students to be successful in college and a 12–16 week Bridge Semester of accelerated academic courses, including a project-based research course, most often a Social Justice course. The program is also unique in that students move through both portions of the program as a cohort, with a program design that consciously creates and develops, through curriculum in the classroom, a peer-support network that facilitates their persistence and success in the program. At the end of the ACE Bridge Semester, students typically accumulate a full-time load of credits (12–16.5 credits), a larger number than the typical remedial program entails, propelling them down the road to completion.

1 In this document, we refer to the type of students served by ACE as underprepared or high-risk. These terms conflate two different ways of characterizing them: they are high risk because of environmental factors—poverty, history of involvement with the judicial system, immigration status, drug abuse, etc.—but they are also highly vulnerable or exhibit low levels of self-efficacy and self-esteem (see Diego Navarro, “Supporting the Students of the Future,” Change: The Magazine of Higher Learning, January/February 2012).

MPR Associates (MPR) of Berkeley, California, is conducting an evaluation of the ACE model as it is implemented at six community colleges: Cabrillo College (Aptos, CA), Hartnell College (Salinas, CA), Los Medanos College (Pittsburg, CA), Las Positas College (Livermore, CA), Berkeley City College (Berkeley, CA), and Delaware County Community College (Media, PA). The goal of the study is to conduct a rigorous longitudinal evaluation of the Academy for College Excellence (ACE) and of the various implementations of the model on the various campuses. At this point, all of the colleges named above have been implementing the ACE model, except Southwest Virginia College. In this report, we include only those colleges for which we have been able to collect outcome data from their college data systems; future reports will include other colleges. All are colleges within California, and the data have been made available from the California Community College Management Information System (MIS). The transcript, course, demographic, and assessment (for placement in English courses) data have been provided by the participating colleges, while the CSSAS data have been collected directly from students via an online data-gathering system.

ACE Participants

ACE serves a varied student population in terms of gender, age, and race (Table 1). Males outnumber females (58% vs. 42%). The majority of ACE students are in the 18–20 year-old age group (54%), but students of all ages participate in the program, including some who are 50 or older (5%). Hispanics comprise the single largest ethnic group served by ACE (43%), followed by white students (26%) and African American students (13%). The demographic profile of ACE students also varies by college (see Appendix). For example, reflecting the geographic locations of the participating colleges, Delaware County Community College, which primarily serves students from the predominantly African American community of Chester, Pennsylvania, has the highest percentage of African American students (56%) in its ACE program, while Cabrillo and Hartnell Colleges serve proportionately more Hispanic students (49% and 83%, respectively) in their ACE programs than do the other colleges. Cabrillo College serves the greatest percentage of students over the age of 30 (26%).

Almost all students are identified by ACE as either “at risk” or “high risk” for not enrolling in or completing a postsecondary program, based on their responses to an intake survey filled out by all prospective ACE students. Students who have at least one “high risk” factor (e.g., high school dropout, homeless, involved in a gang, grew up in foster care, or been in jail) are considered “high risk” students. Students who do not have any “high risk” factors but have one or more “at risk” factors (e.g., English Language Learner, first-generation college, works full time, or has children) are classified as “at risk.” Students with no risk factors are considered “low risk.” The intake survey includes 16 “high risk” factors and 12 “at risk” factors. The distribution of risk scores for the ACE participants who completed the CSSAS ranges from having no risk factors to having up to 12 high-risk factors and up to 9 at-risk factors. The majority of ACE students (97%) are classified as “at risk” or “high risk.” Table 2 presents data about the level of risk of ACE students included in the current report using a different source of data and showing specific risk factors. These data are drawn from intake applications and provide another picture of the at-risk factors associated with many ACE students.

Table 1. Survey Respondents by Gender, Ethnicity, Age Group, and Risk Level (N=293)

Category                   Number    Percent
Gender
  Female                      124       42.3
  Male                        169       57.7
Ethnicity
  African American             39       13.3
  Asian/Pacific Islander       12        4.1
  Hispanic                    126       43.0
  White                        75       25.6
  Multiracial                  14        4.8
  Native American               7        2.4
  Other                         3        1.0
  No Data                      17        5.8
Age Group
  18 to 20                    155       54.4
  21 to 24                     43       15.1
  25 to 29                     29       10.2
  30 to 39                     30       10.5
  40 to 49                     15        5.3
  50 or above                  13        4.6
Risk Level
  Low Risk                      3        1.0
  At Risk                      87       29.7
  High Risk                   195       66.6
  No Data                       8        2.7


Table 2. Risk factors associated with ACE students

Risk Indicator                          Percent
First Generation College (A)              62%
Learning Difficulty (A)                   48%
Receives Government Benefits (A)          41%
Parent is Agricultural Worker (A)         32%
Unstable Home (H)                         29%
Working while in School (A)               28%
Has Been Arrested (H)                     24%
Gang Association (H)                      21%
Parent with Dependent Children (A)        20%
Domestic Violence (H)                     18%
Homeless (H)                              18%
Has Been on Probation (H)                 17%
Substance Abuse (H)                       15%
Child Abuse (H)                           12%
Medical Condition (H)                      7%
Currently on Probation (H)                 7%
Mental Condition (H)                       6%
Foster Care History (A)                    5%

Note: A = At-risk indicator; H = High-risk indicator
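The intake classification rules described above can be sketched as follows (the function name is ours, not ACE's; this is an illustrative sketch, not program code):

```python
def classify_risk(n_high_risk, n_at_risk):
    """Apply the ACE intake rules described in the text: any high-risk
    factor makes the student "high risk"; otherwise any at-risk factor
    makes the student "at risk"; no factors means "low risk".

    n_high_risk -- count of high-risk factors reported (of 16 possible)
    n_at_risk   -- count of at-risk factors reported (of 12 possible)
    """
    if n_high_risk >= 1:
        return "high risk"
    if n_at_risk >= 1:
        return "at risk"
    return "low risk"
```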

Implementation of the ACE Model

Evaluating a program such as ACE that has evolved over a number of years presents a number of challenges because of the need to account for differences in how the program was implemented at participating colleges as well as for changes within colleges over time. The most significant variances occur in how the cohorts are created, designed, and run—i.e., courses, focus, students enrolled, number of credits, and time frame. In addition, while certain core program attributes are considered part of the “canonical” model, other characteristics may vary to accommodate local constraints and may be altered as the program continues to develop and innovate. A key factor has been whether students are included in an accelerated English course, which has varied over time depending on local policy. Another characteristic that can vary and affect outcomes is the program focus of the cohort. For example, in the results reported herein, a cohort at Hartnell College was identified as a Career and Technical Education (CTE) cohort, and the students recruited for that cohort as well as the courses included were different from those at other colleges. Therefore, their results are presented separately. Future reports will also present results from accelerated math cohorts at Los Medanos College.


Purpose of the Study

The evaluation of the ACE program is designed to examine the effect of the program on student achievement and personal growth and to identify the aspects of the program that are most effective and the students who benefit most. It is also intended to examine the level of integrity of implementation of the program with respect to the inclusion of English acceleration and to determine the effect of the level of integrity on student outcomes.

A previous report (“Evaluation of the Academy for College Excellence: Year I Interim Report,” December 2011) examined the effect of ACE participation on psychosocial characteristics that the ACE model is designed to affect because they are believed to be factors that mediate the impact of the program on academic outcomes. Specifically, it compared ACE student performance on the CSSAS before participation in the ACE program, after the Foundation Course, and after the Bridge Semester.

This report examines key academic outcomes—including credit accrual, persistence, full-time enrollment, and successful completion of college-level English—for students participating in the ACE program.2 It includes analyses of academic outcome data for ACE participants and a comparison group of students at Cabrillo College, Hartnell College, and Los Medanos College in the spring 2010, fall 2010, and spring 2011 semesters. It compares the achievement of ACE students to that of other students in each of the colleges using comparison groups constructed from institutional and program data at the end of each semester and again two semesters after the Bridge Semester. Future reports will integrate the two approaches to measuring student outcomes—analysis of psychosocial factors and comparison of measures of student achievement—along with qualitative data on implementation to provide a more comprehensive description of the effects of the ACE program on participants.

2 This part of the study aims to replicate the findings of a study conducted by Columbia University’s Community College Research Center (CCRC) in 2007 and will potentially add indicators that are deemed worthy of study. Note that the CCRC study evaluates the Digital Bridge Academy (the former name of ACE).


Research questions related to academic outcomes

While there is a larger set of questions that frame the overall evaluation, in this report the analyses of the academic outcomes related to implementation of the ACE program address three main questions:

1. How likely are ACE participants to persist in college, to enroll full-time in subsequent semesters, and to pass college-level English courses as compared to a matched group of non-participants?

2. How well do ACE participants perform academically in terms of earning college-level and transferable credits as compared to a matched group of non-participants?

3. What is the effect of accelerated English—enrolling ACE participants in college-level English courses even though they place in remedial English courses one or more levels below college-level—compared to a matched group of non-participants and compared to ACE participants in the same college and term who were not accelerated?

Conceptual framework for analyses

This analysis is motivated by the counterfactual model of causal inference, which defines the true causal effect of an intervention as the difference between the outcome in the presence of the intervention and the outcome in its absence (Neyman, 1990 [1923]; Rubin, 1974; Holland, 1986; Morgan and Winship, 2007; Sekhon, 2009). The fundamental problem of causal inference, though, is that it is impossible to observe both outcomes at once. Instead, the evaluation must try to approximate as closely as possible the answer to the question “What would have happened to these individuals if they had not had the intervention?” Randomized controlled trials (RCTs), or experiments, are generally considered the gold standard for establishing the causality of interventions. Under most conditions, random assignment ensures that the group receiving the intervention is the same as the group not receiving the intervention, even on variables that cannot be adequately measured, such as ability and motivation. RCTs are often infeasible, however, because of resource limitations and ethical concerns. In this case, an RCT would have required the ACE program to turn away a proportion of interested students even if space were available, contravening the program’s stated goal of helping underprepared students succeed in community college.
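In the standard potential-outcomes notation of the literature cited above (the symbols are ours, not the report's), the unobservable individual causal effect and the quantity that matching tries to approximate can be written as:

```latex
% Potential outcomes for student i: Y_i(1) if i participates in ACE,
% Y_i(0) if i does not. Only one of the two is ever observed.
\tau_i = Y_i(1) - Y_i(0)

% Matching approximates the average effect of treatment on the
% treated, where T_i = 1 indicates ACE participation:
\mathrm{ATT} = \mathbb{E}\left[\, Y_i(1) - Y_i(0) \mid T_i = 1 \,\right]
```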


Where random assignment is ruled out, researchers must use other methods to control for factors that affect both participation in the intervention and the outcomes of interest. One quasi-experimental method that is increasingly used in evaluation and social science research is matching, where participants are matched to non-participants with similar background characteristics. In particular, propensity score matching achieves this objective by statistically estimating each individual’s propensity to participate in the intervention based on pre-intervention measures and then matching participants and non-participants with the most similar propensity scores. Propensity score matching has been shown under certain conditions to produce estimates of program effects equivalent to estimates based on random assignment even where other methods such as regression fail (Dehejia and Wahba, 1999; LaLonde, 1986; Agodini and Dynarski, 2004; Peikes, Moreno, and Orzol, 2008). The primary limitation of matching is that it cannot control for unobservable factors, but this is equally true of regression methods and most other multivariate statistical techniques. The propensity score matching methods used here are discussed in more detail below.
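To make the mechanics concrete, a minimal sketch of one-to-one nearest-neighbor propensity score matching follows. This is an illustration only, not the estimation code used in the report, and all names are hypothetical; production analyses would also add caliper restrictions and covariate balance checks.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def match_participants(X, treated):
    """Greedy 1:1 nearest-neighbor propensity score matching
    (with replacement).

    X       -- matrix of pre-intervention covariates, one row per student
    treated -- boolean array, True for program participants
    """
    # Step 1: estimate each student's propensity to participate
    # from the pre-intervention covariates.
    model = LogisticRegression(max_iter=1000).fit(X, treated)
    scores = model.predict_proba(X)[:, 1]

    # Step 2: for each participant, pick the non-participant with
    # the closest estimated propensity score.
    treated_idx = np.flatnonzero(treated)
    control_idx = np.flatnonzero(~treated)
    pairs = {}
    for i in treated_idx:
        nearest = np.argmin(np.abs(scores[control_idx] - scores[i]))
        pairs[i] = control_idx[nearest]
    return pairs, scores
```

Outcomes for the matched pairs can then be compared directly, since matching on the propensity score balances the observed background characteristics between the two groups.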

Cohorts

As of fall 2011, the ACE program was operating at five community colleges in California and one in Pennsylvania. The outcomes in this report are based on three cohorts of California students starting in the spring 2010, fall 2010, and spring 2011 semesters at Cabrillo College in Aptos and Hartnell College in Salinas, and two cohorts starting in fall 2010 and spring 2011 at Los Medanos College in Pittsburg, California. (Future reports will include additional colleges.) The students were tracked longitudinally from the first semester of participation in ACE (the “ACE semester”) through the end of the spring 2011 semester. Participation in ACE was operationally defined as successful completion of the short Foundation Course with a grade of A, B, C, or P. The first or “bridge” semester immediately following the Foundation Course is referred to as the ACE semester. Depending on the college, enrollment in the Foundation Course (DMCP 110 at Cabrillo College, EDU 110 at Hartnell College, and HMSRV 110 at Los Medanos College) was recorded as occurring either during the ACE semester or during the abbreviated summer or winter term immediately prior to the ACE semester.

As noted above, implementation of the ACE program has varied across and within colleges and over time to accommodate local policy decisions (Jenkins et al., 2009). For example, in spring 2010, Cabrillo College offered three versions of the ACE curriculum: an accelerated version in which participants enrolled in college-level English in the ACE semester even if placement exams referred them to remedial English, a non-accelerated version in which participants enrolled in remedial English one level below the college level in the ACE semester, and a version with a focus on Career and Technical Education (CTE) that was not focused on transfer to a four-year college and did not include English in the ACE semester at all. Although it was consistently part of the initial ACE program in 2003 and 2004, Cabrillo College officials eliminated the accelerated English component in 2005 (Jenkins et al., 2009). Later, officials allowed a partial reintroduction of accelerated English, so the spring 2010 and fall 2010 cohorts described in this report included both accelerated and non-accelerated program types. By spring 2011, all ACE participants at Cabrillo College were enrolled in the accelerated program type and took college-level English in the ACE semester. Similarly, in the semesters covered in this report, Hartnell College offered two CTE-focused ACE programs, one in agriculture that included an English course and one in green building that did not, as well as an academically focused non-CTE ACE program. Depending on their educational and career goals, CTE students may intend to earn a certificate or associate’s degree as their terminal award, or they may intend to transfer to a four-year institution to earn a bachelor’s degree. Each of these goals has different educational requirements: transfer to an in-state public university and earning an associate’s degree typically require completion of transfer-level English, while earning a certificate may not require English coursework at all. To reflect this variation, results are disaggregated by program type within college and semester. Completion of college-level English is reported only for program types that included English as part of the ACE semester.

Data

The transcript, demographic, and course data used in this report were obtained by ACE program staff directly from the three colleges. These data come from the standard set of information that all California Community Colleges submit to the California Community Colleges Chancellor's Office Management Information Systems (MIS).3 They contain an extensive set of enrollment behaviors and a more limited set of demographic and other background characteristics.

Background characteristics

Following Jenkins et al. (2009), this report uses the following background characteristics derived from MIS data elements as the basis for constructing a matched comparison group:

• Gender

• Race/ethnicity (indicators for white, African American, and Hispanic, with other categories and missing values treated as the reference category)

• Socioeconomic status, operationalized as whether the student’s home ZIP code has 20% or more of households below the poverty line

3 http://www.cccco.edu/ChancellorsOffice/Divisions/TechResearchInfo/MIS/tabid/1275/Default.aspx.


• Student’s age in years as of December 31 of the year of the ACE semester

• Whether the student graduated from high school

• Whether the student earned a GED or other type of high school equivalency

• Whether the student is a high school dropout

• Number of credits earned at current community college prior to ACE semester

• Whether the student had taken an English as a Second Language (ESL) course in the prior semester (used in analysis of Cabrillo College cohorts only)4

• The student’s placement level in English, in terms of levels below the college level.

Most of the elements were measured dichotomously, but squared terms for the student’s age and prior credits earned were included to account for extreme values. This specification further reduces the already small probability, for example, that an otherwise similar non-participant with many prior credits (who most likely has already overcome any initial obstacles to college success) would match to an ACE participant with few if any prior credits.
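As a concrete illustration, the covariate specification above can be sketched as a function that builds one row of the propensity model’s design matrix. This is only a sketch: the field names are hypothetical, not the actual MIS element names, and the real model also adds ESL and English placement indicators for the cohorts where they apply.

```python
def design_row(student):
    """One row of the propensity model's design matrix (hypothetical fields).

    Dichotomous indicators plus age and prior credits, with squared terms
    for the two continuous measures as described in the text.
    """
    return [
        1 if student["male"] else 0,
        1 if student["white"] else 0,
        1 if student["african_american"] else 0,
        1 if student["hispanic"] else 0,
        1 if student["high_poverty_zip"] else 0,
        1 if student["hs_graduate"] else 0,
        1 if student["ged"] else 0,
        1 if student["hs_dropout"] else 0,
        student["age"],
        student["age"] ** 2,              # squared term for extreme ages
        student["prior_credits"],
        student["prior_credits"] ** 2,    # squared term for heavy prior credit loads
    ]
```

The squared terms let the logistic regression assign sharply lower propensity scores to, say, a 50-year-old with 40 prior credits than a linear-in-age specification would.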

Table 3 reports descriptive statistics for these background variables by college and overall.

4 Jenkins et al. (2009) apparently measured ESL participation in a longer time series of past semesters at Cabrillo College, which may partly explain why they found higher rates of ESL among both participants (7%) and non-participants (4%). They concede that these are likely underestimates because they do not count ESL courses taken elsewhere. Examining course enrollments over more semesters might have identified more ESL students at all colleges.


Table 3. Descriptive statistics of ACE participants

Spring 2010, Fall 2010, Spring 2011

Variable                                          Cabrillo       Hartnell       Los Medanos    All three colleges
                                                  Pct    N       Pct    N       Pct    N       Pct    N
All ACE participants                              100%   461     100%   223     100%   49      100%   773
Male                                              62%    284     65%    144     45%    22      61%    450
White                                             37%    166     9%     17      23%    11      28%    194
African American                                  4%     19      5%     10      28%    13      6%     42
Hispanic                                          53%    238     82%    161     28%    13      59%    412
From high poverty ZIP code                        6%     27      32%    71      0%     0       14%    98
High school graduate                              67%    295     60%    130     81%    38      66%    463
Completed GED                                     19%    85      13%    27      13%    6       17%    118
High school dropout                               8%     35      15%    33      6%     3       10%    71
ESL student                                       1%     5       0%     0       0%     0       1%     5
Placed at college-level English*                  13%    48      7%     9       5%     2       11%    59
Placed one level below college-level English*     47%    178     29%    37      55%    21      44%    236
Placed two or more levels below college level*    40%    149     64%    83      39%    15      46%    247

                                                  Mean           Mean           Mean           Mean
Age                                               26.2           22.9           22.8           24.9
Prior college credits earned                      3.7            2.5            4.8            3.4

* Results exclude 26% of ACE participants overall with missing English placement data (19% at Cabrillo College, 42% at Hartnell College, and 22% at Los Medanos College).

Limitations

The MIS data used here lack several key items related to students’ background characteristics that are typically used in studies of college student success. Intake forms completed by ACE participants show that high percentages of them have background factors that put them at risk of not completing college (Jenkins et al., 2009). These risk factors include past substance abuse, participation in gangs, and having a criminal record. Regrettably, with the partial exception of learning English as a second language, none of these risk factors could be reliably extracted from the MIS data for non-participants. A concerted effort was made to identify students who attended alternative or continuation high schools, but high school codes were missing for over half of ACE participants and non-participants.


Similarly, no direct measures of students’ socioeconomic status, such as parental income and education, were available. Receipt of financial aid was considered and rejected because it is not considered a valid or reliable indicator of financial need for California community college students for a number of reasons, including relatively low fees, the high administrative burden of completing financial aid paperwork, restricted eligibility for the large proportion of students who enroll part time, and the limited English proficiency of many students (TICAS, 2007; Berkner & Woo, 2008). Instead, following Jenkins et al. (2009), a high percentage of households in poverty in the student’s home ZIP code was used as a proxy for socioeconomic status, recognizing that using ecological measures to infer individual-level correlations may be problematic (Robinson, 1950). No ACE participants at Los Medanos College were from low-income ZIP codes by this definition.

Future reports will incorporate a more extensive set of variables from student surveys, including additional measures of socioeconomic status, intention to enroll full time or part time, high school type, and selected psychosocial factors that are emphasized in the ACE program, such as mindfulness and self-efficacy.

Outcome data

This report uses the following indicators of student progress and success:

• Percentage of students who passed college-level English (one level below transfer-level and applicable toward an associate’s degree) during the ACE semester and by the end of the first two semesters following the ACE semester;

• Percentage of students enrolled full time (12 credits or more) at the same college one semester following the ACE semester;

• Percentage of students enrolled at the same college in the semester following the ACE semester and two semesters following the ACE semester;

• Mean cumulative number of college-level credits (applicable toward an associate’s degree) earned during the ACE semester and one semester following the ACE semester;

• Mean cumulative number of transferable credits (applicable toward an associate’s degree and toward transfer to a UC or CSU campus) earned during the ACE semester and one semester following the ACE semester.
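To make the enrollment-based indicators concrete, two of them can be derived from per-term enrollment records roughly as follows. This is a hedged sketch: the field names are illustrative, not the actual MIS element names.

```python
def term_outcomes(enrollments, next_term):
    """Persistence and full-time enrollment in the term after the ACE semester.

    `enrollments` is a list of dicts with 'term' and 'credits' keys
    (hypothetical field names). Persistence counts any enrollment at the
    same college; full-time means 12 or more credits.
    """
    next_credits = sum(e["credits"] for e in enrollments if e["term"] == next_term)
    return {
        "persisted": next_credits > 0,      # enrolled at all next term
        "full_time": next_credits >= 12.0,  # 12 credits or more
    }

# A student taking 9 credits the following term persisted but was not full time
flags = term_outcomes([{"term": "2010FA", "credits": 9.0}], "2010FA")
```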

These intermediate outcomes, while arguably meaningful in their own right, are important because they have been shown to correlate with ultimate success in community colleges (as defined by completion of certificates and degrees and transfer to four-year colleges) (Horn & Radwin, 2012; Offenstein, Moore, & Shulock, 2010; Offenstein & Shulock, 2010; Moore, Shulock, & Offenstein, 2009; Leinbach & Jenkins, 2008; Calcagno et al., 2006; Adelman, 2005). For example, California Community College students typically need at least 60 transferable credits to transfer to a University of California or California State University campus with upper-division standing (Moore, Shulock, & Offenstein, 2009), and most associate’s degrees require at least 60 college-level credits (McCormick, 1999). Early accumulation of credits is the first step on the path to transferring and earning an associate’s degree. Likewise, because completion and transfer almost always require at least two years of full-time enrollment, persistence across semesters is all but necessary to achieve either of these goals.

These interim measures of student progress provide early feedback on the efficacy of the ACE program long before most students would be expected to graduate or transfer to a four-year college with upper-division standing. For instance, even the minority of community college students who earn an associate’s degree within six years still take over three years, on average, to complete it (Green & Radwin, 2012). Delaying this analysis for three or more years while students progress through college would compromise the timeliness of this evaluation.

Methods

Propensity score matching

The estimated effects in this report are based on propensity score matching using a 1:1 nearest neighbor match without replacement. A student’s propensity score is the estimated likelihood that the student would participate in the ACE program, regardless of whether he or she actually did, as a function of the student’s background characteristics. Propensity scores were generated using logistic regression and calculated as the predicted probability of participation in ACE. A student with a propensity score of .15, for example, has an estimated 15% probability of participating in ACE.

In plain language, each ACE participant is matched to the single non-participant with the most similar propensity score, and that non-participant is removed from the pool of available matches. In cases of ties, where two or more non-participants with identical propensity scores were the closest matches to an ACE participant, the matched non-participant is selected randomly. ACE participants are only matched to non-participants within the same college, program type5 (e.g., accelerated, CTE, etc.), and semester. ACE participants with placement scores (74 percent overall, including 81 percent at Cabrillo College, 58 percent at Hartnell College, and 78 percent at Los Medanos College) and without placement scores were matched separately to non-participants with and without placement scores, respectively.

To illustrate the process, an ACE participant with a propensity score of .15 would be matched with a non-participant with the propensity score closest to .15. That non-participant would not be matched to a participant again. If there were no available non-participant with a propensity score exactly equal to .15, the process would seek out a non-participant with a score of .14 or .16, and so forth. If there were multiple available non-participants with propensity scores of .15, then one would be selected at random.
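The matching step described above can be sketched in a few lines. This is a minimal illustration, not the psmatch2 implementation; for simplicity, ties go to the first candidate rather than being broken at random as in the actual analysis.

```python
def nearest_neighbor_match(treated_scores, control_scores):
    """1:1 nearest-neighbor propensity score matching without replacement.

    Each treated unit takes the unmatched control with the closest
    propensity score; that control then leaves the pool. Returns a list
    of (treated_index, control_index) pairs.
    """
    available = list(range(len(control_scores)))
    pairs = []
    for t_idx, t_score in enumerate(treated_scores):
        if not available:
            break  # control pool exhausted
        best = min(available, key=lambda c: abs(control_scores[c] - t_score))
        pairs.append((t_idx, best))
        available.remove(best)
    return pairs

# The first participant (score .15) takes the nearest control (.16),
# leaving the .39 control for the second participant (score .40).
pairs = nearest_neighbor_match([0.15, 0.40], [0.13, 0.16, 0.39])
```

Common support could be enforced before this step by dropping treated units whose scores fall outside the range of the control scores, as the text describes.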

To ensure common support, ACE participants with a propensity score larger than the largest score for a non-participant or smaller than the smallest score for a non-participant were excluded from the analysis. This requirement excluded very few students.

To test balance between the ACE and comparison groups, the mean value for each variable was calculated for each group before and after matching. Balance was maximized by iterated adjustments to the propensity score model. In college-cohort pairs where no ACE participants were ESL students (Cabrillo College in spring 2011, Hartnell College in all three semesters, and Los Medanos College in both semesters), non-participants who were ESL students were dropped from the pool of available matches before matching. The matching was implemented with the psmatch2 module in Stata/SE 12.1 for Windows (Leuven & Sianesi, 2003; StataCorp, 2012). The pre-matching and post-matching means, relative differences, t-statistics, and p-values are reported for each college-cohort pair.
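A balance check of this kind can be sketched as follows. This is a hedged illustration, not the psmatch2 output; Welch’s unpooled standard error formula is assumed.

```python
import math
from statistics import mean, stdev

def balance_stats(treated, control):
    """Mean difference and Welch t-statistic for one covariate.

    Run before and after matching: a post-matching t-statistic near zero
    suggests the covariate is balanced across the two groups.
    """
    diff = mean(treated) - mean(control)
    se = math.sqrt(stdev(treated) ** 2 / len(treated)
                   + stdev(control) ** 2 / len(control))
    return diff, diff / se

# Identical age distributions: the difference and t-statistic are both zero
diff, t = balance_stats([19, 22, 25, 30], [19, 22, 25, 30])
```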

As a check on the robustness of these results to other forms of estimation, a separate analysis, not reported here, used ordinary least squares and logistic regression to compare these and other outcomes for participants and non-participants while controlling for the same background factors. The total sample sizes for each regression, including participants and non-participants, ranged from 2,994 to 12,760 depending on the outcome and the particular subset of students included. By and large, the results of the regressions mirrored the substantive results of the matching analyses reported here.

5 Program type is typically determined by the inclusion of particular courses above and beyond the “canonical” ACE model, which includes the Foundation Course, Team Self-Management, the behavior system, and usually accelerated coursework in English or math.


Findings

Tables 4 to 8 show findings related to the five outcome measures for the applicable ACE cohorts and the matched comparison group. The results are disaggregated by program type where possible. Each row in each table includes the number of ACE participants; the number of non-participants is the same because they are matched 1:1 to participants. Each row also shows the difference in mean values between ACE participants and non-participants, the standard error of the difference, and an indication of whether the difference is statistically significant at the .05 or .01 level.6

Completion of college-level English

ACE participants were much more likely to complete college-level English by the end of the ACE semester than a matched group of non-participants, and this difference persisted even one and two semesters after the ACE semester. For example, as shown in the first row of Table 4, 54 percent of ACE participants who started in the spring 2010 semester completed college-level English by the end of the ACE semester, compared with 25 percent of non-participants, a 29 percentage point difference that is statistically significant at the .01 level. By the end of the first semester after the ACE semester, a slightly higher percentage of both ACE participants and non-participants had passed college-level English, but the difference remained at 29 percentage points (p < .01). And by two semesters after the ACE semester, the gap narrowed slightly to 25 percentage points (p < .01) as both groups became marginally more likely to have completed college-level English.

The story is similar for most other colleges, cohorts, and program types: ACE participants were considerably more likely to complete college-level English than non-participants in the ACE semester, with differences ranging from 25 percentage points to 83 percentage points, though non-participants partly “caught up” in subsequent semesters as the differences between the groups attenuated. There were two exceptions to this overall finding. First, among the non-accelerated Cabrillo ACE participants, those starting in spring 2010 were less likely to complete college-level English than non-participants during the ACE semester or the subsequent two semesters (p < .01 or p < .05, depending on the semester). Those starting in fall 2010 were less likely to complete college-level English in the ACE semester but more likely to do so in the following semester, though neither result is statistically significant. The other exception was the Hartnell agricultural CTE ACE participants, but this result is based on just ten participants and does not reach statistical significance.

6 Statistical significance measures the probability that a sample would have yielded a difference of a given magnitude due to random sampling error if the true value of the difference in the population were zero; that is, if by chance the groups in the sample had different outcomes even though the outcomes were the same in the population. A typical standard for statistical significance is a probability of less than 5 percent that the difference could have been caused by chance (p < .05), and differences with a less than 1 percent probability of being caused by chance (p < .01) are even more highly statistically significant. A difference that does not reach statistical significance at the .05 level does not necessarily imply that there is no difference in the population; it only indicates that there is at least a 5 percent probability that the difference could be due to chance.
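To make the significance tests concrete, a difference like the spring 2010 Cabrillo result can be checked with an unpaired two-proportion z-test. This is a hedged sketch: the report’s standard errors come from the matched analysis and will differ somewhat from this approximation.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Difference in proportions, its unpooled standard error, and z-statistic."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, se, diff / se

# Cabrillo accelerated, spring 2010: 54.4% of 69 participants vs. 25.0% of
# 69 matched non-participants completed college-level English.
diff, se, z = two_proportion_z(0.544, 69, 0.250, 69)
# z comes out well above 2.58, i.e., significant at the .01 level (two-tailed)
```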


Table 4. College-level English completion by semester (percent)

Spring 2010 cohort
                                        End of ACE semester         End of first semester after   End of second semester after
College and program (participants)      ACE   Comp.  Diff.   SE     ACE   Comp.  Diff.   SE       ACE   Comp.  Diff.   SE
Cabrillo accelerated (N = 69)           54.4  25.0   29.4    8.1**  57.4  27.9   29.4    8.2**    60.3  35.3   25.0    8.4**
Hartnell non-CTE (N = 47)               61.7  10.6   51.1    8.5**  61.7  23.4   38.3    9.5**    61.7  23.4   38.3    9.5**
Total (N = 116)                         57.4  19.1   38.3    5.9**  59.1  26.1   33.0    6.2**    60.9  30.4   30.4    6.3**
Cabrillo non-accelerated (N = 46)        0.0  19.6  -19.6    5.9**  13.0  30.4  -17.4    8.5*     15.2  39.1  -23.9    9.0**

Fall 2010 cohort
                                        End of ACE semester         End of first semester after
College and program (participants)      ACE   Comp.  Diff.   SE     ACE   Comp.  Diff.   SE
Cabrillo accelerated (N = 129)          48.4  23.4   25.0    5.8**  60.9  35.9   25.0    6.1**
Hartnell agricultural (N = 10)          20.0  20.0    0.0   18.9    20.0  60.0  -40.0   21.1
Hartnell non-CTE (N = 56)               62.5  21.4   41.1    8.6**  62.5  44.6   17.9    9.4
Hartnell overall (N = 66)               56.1  22.2   33.8    8.1**  56.1  44.4   11.6    8.8
Los Medanos (N = 24)                    70.8  12.5   58.3   11.7**  70.8  41.7   29.2   14.0*
Total (N = 219)                         53.2  21.9   31.4    4.4**  60.6  39.1   21.5    4.7**
Cabrillo non-accelerated (N = 24)        0.0  12.5  -12.5    6.9    33.3  25.0    8.3   13.3

Spring 2011 cohort
                                        End of ACE semester
College and program (participants)      ACE   Comp.  Diff.   SE
Cabrillo accelerated (N = 136)          58.1  18.4   39.7    5.4**
Hartnell agricultural (N = 18)          83.3   0.0   83.3    9.0**
Hartnell non-CTE (N = 29)               72.4  13.8   58.6   10.7**
Hartnell total (N = 47)                 76.6   8.7   67.9    7.6**
Los Medanos (N = 21)                    61.9  19.0   42.9   14.0**
Total (N = 204)                         62.7  16.3   46.5    4.3**

*p < .05; **p < .01; two-tailed test
NOTES: All Cabrillo students are accelerated in spring 2011. English outcomes exclude the Hartnell green building and Cabrillo CTE cohorts, which did not include an English course in the ACE semester.


Figures 1 and 2 show the trajectory of college-level English completion for ACE students overall, excluding non-accelerated participants at Cabrillo College. Although the percentage of students completing college-level English increases in both groups, the rate of change is slightly higher among non-participants, and the gap narrows slightly over time. This pattern suggests that the non-participants are beginning to “catch up” to the ACE participants. Another possible interpretation is that the effect of the ACE program on completion of college-level English tends to fade after the ACE semester, but again, it would be difficult to say conclusively based on these limited data. In any case, requiring students to take college-level English as part of a broader program appears to be an effective technique for increasing their likelihood of completing college-level English, particularly on the first attempt, and it enables them to take transfer-level English or other more advanced courses sooner than would have been possible had they not completed college-level English.

Figure 1. College-level English completion by semester, spring 2010 cohort

[Graph of the percentage completing college-level English (0 to 100) for ACE participants and the comparison group at the end of the ACE semester and the first and second semesters after.]

Figure 2. College-level English completion by semester, fall 2010 cohort

[Graph of the percentage completing college-level English (0 to 100) for ACE participants and the comparison group at the end of the ACE semester and the first semester after.]


Credit accrual

Table 5 reports the mean cumulative number of college credits (applicable to an associate’s degree or CTE certificate but not necessarily transferable to a four-year college) earned during the ACE semester and the semester that follows. Associate’s degrees typically require 60 college credits, and certificate programs require anywhere from 0.5 to over 100 credits (Moore, Jez, Chisolm, & Shulock, 2012). Overall, ACE participants earned 7 to 8 more credits than non-participants (p < .01 in every case), depending on the semester. Participants starting in fall 2010 earned 12 credits on average by the end of the ACE semester and 16 credits by the end of the first semester after the ACE semester, compared with 5 and 10 credits, respectively, for non-participants. Similarly, ACE participants starting in spring 2011 averaged 12 credits compared with non-participants’ 4 credits. This pattern of ACE participants earning more credits than non-participants held true for virtually every college, program type, and semester. The only exception was the Hartnell agricultural CTE participants in fall 2010, but once again this result is based on just 10 participants and was not statistically significant.

In contrast, when it comes to credits that are transferable to an in-state public university, ACE participants earned fewer on average than non-participants, though neither group earned very many. California community college students typically need a minimum of 60 transferable credits in order to transfer to a University of California or California State University campus with upper-division standing (Moore, Shulock, & Offenstein, 2009). Overall, ACE participants completed 1 to 2 transferable credits by the end of the ACE semester, compared with 3 credits for non-participants, and by the end of the first semester after the ACE semester, ACE participants had earned 5 transferable credits compared with non-participants’ 7 credits (all differences p < .01). With the exception of the ACE semester outcome for Hartnell College’s green building cohort, this pattern held true for all colleges, program types, and semesters, with ACE participants earning 1 to 4 fewer credits than non-participants, though many of the differences are not statistically significant. This difference might be explained in part if some non-participants were better prepared to take transfer-level courses in the ACE semester, above and beyond what their English placement levels and prior college credits would indicate. Future reports will attempt to determine whether this is the case.


Table 5. Cumulative college-level credits earned by semester

Fall 2010 cohort
                                        End of ACE semester         End of first semester after
College and program (participants)      ACE   Comp.  Diff.   SE     ACE   Comp.  Diff.   SE
Cabrillo accelerated (N = 128)          11.8   4.3    7.5    0.6**  16.4   8.5    8.0    1.1**
Cabrillo non-accelerated (N = 24)       12.4   3.9    8.5    0.9**  16.8   6.4   10.4    1.8**
Hartnell green building (N = 20)         8.3   3.1    5.2    1.3**   9.6   5.7    3.9    2.2
Hartnell agricultural (N = 10)           7.7   5.0    2.6    2.0    12.0  12.0   -0.1    4.1
Hartnell non-CTE (N = 56)               11.6   6.2    5.3    1.0**  17.1  13.0    4.1    1.9*
Hartnell overall (N = 86)               10.3   5.5    4.9    0.8**  14.8  11.2    3.6    1.5*
Los Medanos (N = 24)                    13.4   5.2    8.2    1.5**  20.9  11.4    9.5    2.7**
Total (N = 262)                         11.5   4.8    6.8    0.4**  16.3   9.6    6.7    0.8**

Spring 2011 cohort
                                        End of ACE semester
College and program (participants)      ACE   Comp.  Diff.   SE
Cabrillo accelerated (N = 136)          12.4   4.3    8.1    0.6**
Hartnell agricultural (N = 18)           9.1   3.5    5.5    1.7**
Hartnell non-CTE (N = 29)               10.1   3.8    6.2    1.2**
Hartnell total (N = 47)                  9.7   3.8    5.9    1.0**
Los Medanos (N = 21)                    13.1   3.9    9.2    1.3**
Total (N = 204)                         11.8   4.2    7.7    0.5**

*p < .05; **p < .01; two-tailed test
NOTES: All Cabrillo students are accelerated in spring 2011.


Table 6. Cumulative transferable credits earned by semester

Fall 2010 cohort
                                        End of ACE semester         End of first semester after
College and program (participants)      ACE   Comp.  Diff.   SE     ACE   Comp.  Diff.   SE
Cabrillo accelerated (N = 128)           0.9   2.9   -1.9    0.3**   4.0   5.6   -1.6    0.7*
Cabrillo non-accelerated (N = 24)        3.3   2.4    0.8    0.7     5.5   3.9    1.6    1.3
Cabrillo overall (N = 152)               1.3   2.7   -1.4    0.3**   4.2   5.4   -1.2    0.7
Hartnell green building (N = 20)         2.7   2.3    0.4    1.0     3.7   4.5   -0.8    1.7
Hartnell agricultural (N = 10)           1.4   3.1   -1.7    1.6     5.2   8.2   -3.0    3.1
Hartnell non-CTE (N = 56)                1.9   4.6   -2.7    0.6**   6.5   9.7   -3.2    1.4*
Hartnell overall (N = 86)                2.0   4.0   -2.0    0.5**   5.7   8.3   -2.6    1.1*
Los Medanos (N = 24)                     0.5   4.0   -3.5    0.9**   7.3   8.3   -1.0    1.9
Total (N = 262)                          1.5   3.3   -1.8    0.3**   5.0   6.6   -1.6    0.6**

Spring 2011 cohort
                                        End of ACE semester
College and program (participants)      ACE   Comp.  Diff.   SE
Cabrillo accelerated (N = 136)           1.0   2.9   -1.9    0.3**
Hartnell agricultural (N = 18)           1.1   3.0   -1.9    0.9
Hartnell non-CTE (N = 29)                1.6   3.1   -1.6    0.6*
Hartnell total (N = 47)                  1.4   3.1   -1.7    0.5**
Los Medanos (N = 21)                     0.4   3.3   -3.0    0.8**
Total (N = 204)                          1.0   3.0   -2.0    0.3**

*p < .05; **p < .01; two-tailed test
NOTES: All Cabrillo students are accelerated in spring 2011.


Figures 3 and 4 show the mean number of college credits and transferable credits, respectively, for ACE participants and the comparison group overall in the ACE semester and the subsequent semester. These figures show that ACE participants earned substantially more college credits and slightly fewer transferable credits, on average, than non-participants. Unlike the case with completion of college-level English, there is no obvious indication that the difference in either college credit accumulation or transferable credit accumulation changed appreciably over time, although it is only possible to compare two points in time. Future reports will reveal whether this difference stays constant and whether either kind of credit accumulation gathers momentum or levels off.

Figure 3. Mean college credits by semester, fall 2010 cohort

[Graph of mean college credits (0 to 20) for ACE participants and the comparison group at the end of the ACE semester and the first semester after.]

Figure 4. Mean transferable credits by semester, fall 2010 cohort

[Graph of mean transferable credits (0 to 20) for ACE participants and the comparison group at the end of the ACE semester and the first semester after.]


Full-time enrollment after ACE semester

Full-time enrollment in the first semester after the ACE semester is reported in Table 7 for the fall 2010 cohort. Overall, 46 percent of ACE participants and 38 percent of non-participants enrolled full time in the first semester after the ACE semester, a difference of 8 percentage points that falls just short of statistical significance at the .05 level. The results varied considerably by college and program type. At one extreme, Los Medanos College ACE participants were 33 percentage points more likely to enroll full time in the next semester than non-participants, and Cabrillo accelerated participants were 16 percentage points more likely (both differences p < .05). At the other extreme, Hartnell College participants were 5 to 15 percentage points less likely to enroll full time in the following semester (no difference statistically significant).

Table 7. Percent enrolled full time in first semester after the ACE semester, fall 2010 semester

College and program (participants)      ACE    Comp.  Diff.   SE
Cabrillo accelerated (N = 128)          49.2   33.6   15.6    6.1*
Cabrillo non-accelerated (N = 24)       37.5   29.2    8.3   13.8
Hartnell green building (N = 20)        10.0   25.0  -15.0   12.1
Hartnell agricultural (N = 10)          30.0   40.0  -10.0   22.4
Hartnell non-CTE (N = 56)               42.9   48.2   -5.4    9.5
Hartnell overall (N = 86)               33.7   42.3   -8.6    7.6
Los Medanos (N = 24)                    79.2   45.8   33.3   13.4*
Total (N = 262)                         45.8   37.5    8.3    4.4

*p < .05; **p < .01; two-tailed test

Persistence

Table 8 reports one-semester persistence, defined as full-time or part-time enrollment at the same college in the semester after the ACE semester. Overall, 70 percent of ACE participants persisted one semester compared with 69 percent of non-participants, a difference of less than 1 percentage point before rounding (not statistically significant). As was the case with full-time enrollment, one-semester persistence varied considerably across colleges and program types, from a low of 30 percent in Hartnell College’s fall 2010 green building cohort to a high of 92 percent in Los Medanos College’s fall 2010 cohort. The differences between ACE participants and non-participants, however, were more modest, and they were also inconsistent in direction and magnitude across colleges. At one end of the range, Los Medanos College participants were 21 percentage points more likely to persist, while at the other end, Hartnell College participants were 13 to 30 percentage points less likely to persist, depending on program type.


Table 8. Percent persisted to first semester after the ACE semester, fall 2010 semester

College and program (participants)      ACE    Comp.  Diff.   SE
Cabrillo accelerated (N = 128)          75.0   65.6    9.4    5.7
Cabrillo non-accelerated (N = 24)       66.7   66.7    0.0   13.9
Hartnell green building (N = 20)        30.0   55.0  -25.0   15.5
Hartnell agricultural (N = 10)          60.0   90.0  -30.0   19.1
Hartnell non-CTE (N = 56)               64.3   76.8  -12.5    8.6
Hartnell overall (N = 86)               55.8   73.1  -17.3    7.4*
Los Medanos (N = 24)                    91.7   70.8   20.8   11.1
Total (N = 262)                         69.5   69.2    0.3    4.1

*p < .05; **p < .01; two-tailed test
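The "Hartnell overall" row is consistent with an enrollment-weighted average of the three Hartnell program-type rows; the weighting is inferred from the Ns rather than stated in the report. A quick check against the ACE column of Table 8:

```python
# (N, ACE persistence %) for Hartnell green building, agricultural, non-CTE.
rows = [(20, 30.0), (10, 60.0), (56, 64.3)]

total_n = sum(n for n, _ in rows)
overall = sum(n * pct for n, pct in rows) / total_n
print(round(overall, 1))  # prints: 55.8, matching the Hartnell overall row
```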

Summary of findings

Table 9 summarizes the results for the five outcomes described above. ACE participants were considerably more likely to complete college-level English than a matched comparison group, and they earned 7 to 8 more college credits on average in the ACE semester. The average ACE participant had earned about 2 fewer transferable credits cumulatively than the average non-participant in the ACE semester and still had earned 2 fewer transferable credits by the end of the first semester after the ACE semester. ACE participants on the whole were more likely to enroll full time in the semester after the ACE semester, but they were no more or less likely to enroll when part-time enrollment is considered in addition to full-time enrollment.

These results suggest several ways in which the ACE program is effective in guiding at-risk community college students toward ultimate success. Most participants completed college-level English in the ACE semester, a key requirement for associate’s degrees and for enrolling in transfer-level English, despite placing into remedial English. This English completion rate is more than double the rate of a comparable group of non-participants. Similarly, ACE participants earned more than twice as many college-level credits on average as non-participants in the ACE semester, another early momentum point associated with completion and transfer. And ACE participants were more likely than non-participants to enroll full time in the semester following the ACE semester, enabling them to earn credits more rapidly.

The ACE program is less effective in other ways. Participants earned slightly fewer transferable credits, on average, than non-participants, but neither group earned very many, considering that students typically need at least 60 credits to transfer to an in-state public university. Also, when part-time enrollment is factored in, ACE participants were no more likely than non-participants to persist to the semester after the ACE semester.


Furthermore, differences in outcomes across colleges show varying levels of apparent effectiveness. Most prominently, students in program types with non-accelerated English or no English at all in the ACE semester generally had less favorable outcomes, pointing to the likelihood that college-level English is a necessary condition for success. (As noted earlier, since spring 2011, the ACE program at Cabrillo College has required all students to take college-level English, regardless of placement level.) Uneven outcomes for full-time enrollment and overall persistence across colleges and program types may indicate that some configurations or local implementations are more effective than others, but these differences may also be a result of chance. It also remains to be seen whether these early successes translate into later academic outcomes, namely completion of certificates and associate’s degrees and transfer to four-year colleges. Future reports, which will be able to track the early cohorts over a longer time period, will continue following these and other outcomes and should yield greater insight into the effect of the ACE program on at-risk community college students.

Table 9. Summary of outcomes, students starting fall 2010 and spring 2011

Outcome                                                                        ACE participants   Comp. group
Passed college-level English by end of ACE semester*
    fall 2010                                                                        53.2%            21.9%
    spring 2011                                                                      62.7%            16.3%
Passed college-level English by end of first semester after ACE semester*            60.6%            39.1%
Mean college credits earned during the ACE semester
    fall 2010                                                                        11.5              4.8
    spring 2011                                                                      11.8              4.2
Mean college credits earned by end of first semester after ACE semester              16.3              9.6
Mean transferable credits earned during the ACE semester
    fall 2010                                                                         1.5              3.3
    spring 2011                                                                       1.0              3.0
Mean transferable credits earned by end of first semester after ACE semester          5.0              6.6
Full-time enrollment in first semester after ACE semester                            45.8%            37.5%
Persistence to the first semester after ACE semester                                 69.5%            69.2%

* Results exclude program types with non-accelerated English course or no English course in ACE semester.
Note: Results are combined for all ACE participants unless otherwise noted. Maximum N = 262.

