
Path to Accelerated Completion and Employment

Evaluation Meeting
July 31, 2012

2

New Growth Group

• New Growth is a full-service evaluation firm specializing in postsecondary education and workforce development.
  – Christopher Spence, Evaluation Project Manager
  – Joel Elvery, PhD, Data Analysis

• Partnering with Corporation for a Skilled Workforce for implementation assessment
  – Holly Parker, Ed Strong, Leise Rosman

3

Goals

• Measure the impact of strategies on student outcomes

• Capture the variety of approaches implemented for each strategy in the state

• Contribute to continuous improvement

• Comply with USDOL evaluation requirements

4

Approach

• Two parts:
  – Impact assessment
  – Implementation assessment

• Two audiences
  – USDOL – Reporting and compliance
  – PACE colleges – More detail to identify best practices

• Approach tailored for each strategy in the original proposal

5

Impact Assessment

• Before-and-after research design

Key measures (tracked before and after):
• Academic Progress Measures
• Program Completion Rates
• Employment Outcomes

6

Implementation Assessment

• Documentation of approaches at each college and how they were implemented
  – Interviews, questionnaires, etc.

• Informs initiative continuous improvement efforts in later stages

• Sets the stage for future student success agenda

7

The Strategies (Strategy / Approach / Cycle)

• 1.1a Assessment/placement practice and retake
  – Assessment score increase
  – Improved placement outcomes
  – Cycle: Term

• 1.1b Prior learning credit
  – # credits earned by PLA type
  – Relationship with academic progress
  – Cycle: Term

• 1.2 & 1.3 Developmental course redesign/elimination (before-and-after comparison)
  – Developmental requirement completion
  – Program English/math requirement completion
  – Cycle: Term

• 2.1 Actively engage employers
  – Implementation assessment TBD

• 2.2 Streamline targeted programs (comparison cohort approach)
  – Student retention, academic progress, completion, and employment
  – Cycle: Term

• 3.1 New guidance technologies
  – Implementation assessment TBD

• 3.2 Partnerships with WIBs to develop Virtual Career Centers
  – Implementation assessment TBD

8

First Year Timeline (Sep 2012 – Sep 2013)

- 8/3 Surveys: Employer Engagement, Developmental Education, Streamlining

- 9/1 Comparison cohorts defined

- Program Launch

- 11/14 Quarterly Report

- 11/14 Annual Report

- 1/21 College data reports

- 2/14 Quarterly report

- 3/15 First semester roll-up

- 5/15 Quarterly report

- 6/21 College data reports

- 8/14 Quarterly report

- 8/16 Second semester roll-up

- 9/21 College data reports

Notes:
• Colleges still provide monthly progress reports to NWACC
• Expect implementation assessment activities closer to end of first semester

Questions?

10

Contact Information

• Project Manager: Chris Spence, 216.767.6262, cspence@newgrowthplanners.com

• Impact Assessment: Joel Elvery, PhD, 216.375.6777, jelvery@newgrowthplanners.com

• Implementation Assessment: Holly Parker, 734.769.2900, hparker@skilledwork.org

PACE Impact Assessment

12

Data plan

• Quantitative evaluation design
• Comparison cohort plan
• Data requested
• Data submission

13

Quantitative evaluation design

• Two purposes
  – Meet DOL requirements
  – Inform stakeholders whether new approaches are increasing student success

• Using before-and-after comparison (sketched below)
  – Focusing on cohorts engaged in targeted programs in Fall 2012 vs. those in Fall 2010
  – More than what's needed for DOL requirements
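A minimal sketch of the before-and-after estimate described above, where \(\bar{Y}\) is assumed to be a cohort's mean outcome (e.g., its completion rate):

$$
\widehat{\Delta} \;=\; \bar{Y}_{\text{Fall 2012 cohort}} \;-\; \bar{Y}_{\text{Fall 2010 cohort}}
$$

Any gap is read as the effect of the new strategies, which is why the comparison cohort plan on the next slides works to make the past cohorts as similar as possible to the Fall 2012 cohorts.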

14

Comparison cohort plan

• Where possible, will use past cohorts from targeted programs as comparison group

• Gathering comparison data from ADHE
  – Except for some developmental education metrics not in ADHE data

• New programs or dramatically shortened programs will have to be matched to other similar programs

• DOL convening in early August

15

Comparison cohort plan

• Next steps on comparison cohort plan
  – Learn about targeted programs & their duration
  – Develop groupings of programs
  – Write up cohort strategy for DOL
  – Get DOL approval
  – Inform colleges of any additional data they need to provide

16

Means of Data Collection (Source – Type)

• ADHE – Student-level course and program data, including data for comparison cohorts

• Arkansas Research Center – Individual-level employment and earnings data

• Colleges – Individual-level data not captured by ADHE (see next slides)

17

Data required from colleges

1. Test scores & placement of students involved in assessment test preparation
2. Prior Learning Assessments
3. Demographics of students in targeted programs of study
4. Completion of developmental education requirements for students in targeted programs
5. Historic data on developmental education progress for past cohorts

18

Data on PREP Participants

• Who should be included
  – Every student who uses assessment test preparation provided in conjunction with the PACE grant, regardless of whether in a targeted program

• What we need to know (see sketch below)
  – Identifying variables
  – Type of assessment test; placement before & after readiness course
    • For math, reading, & English assessments
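A minimal sketch of how these fields could feed the before-and-after placement analysis. File and column names here are hypothetical (the real variable names come from the data spreadsheets), and placement levels are assumed to be coded numerically, with higher meaning closer to college-ready:

```python
import pandas as pd

# Hypothetical file and column names; real names are defined in the
# PACE data spreadsheets.
prep = pd.read_csv("prep_participants.csv")

# Flag students whose placement improved after the readiness course,
# assuming numeric placement codes (higher = closer to college-ready).
prep["improved"] = prep["placement_after"] > prep["placement_before"]

# Share of PREP participants who improved, by assessment subject
# (math, reading, English).
print(prep.groupby("subject")["improved"].mean())
```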

19

Data on PLA Participants

• Who should be included
  – Everyone who attempts to get credit through a prior learning assessment

• What we need to know (see sketch below)
  – Identifying variables
  – Total credit hours earned through PLA
  – Credit hours earned through each of the following:
    • Portfolio
    • Standardized test
    • Local test
    • Training
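A minimal consistency check one might run on these fields, with hypothetical column names standing in for the spreadsheet's actual variables:

```python
import pandas as pd

# Hypothetical layout: one row per student attempting PLA credit.
pla = pd.read_csv("pla_participants.csv")
methods = ["portfolio", "standardized_test", "local_test", "training"]

# Hours earned through each method should sum to the reported total.
mismatch = pla[pla[methods].sum(axis=1) != pla["total_pla_hours"]]
print(f"{len(mismatch)} rows where hours by type do not sum to the total")
```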

Questions on PREP or PLA data?

21

One-time student data

• Who should be included
  – All students enrolled in a targeted program of study
  – Includes students who began prior to Fall 2012 who are still enrolled

• What we need to know
  – Identifying variables
  – Student demographics from intake form
  – Developmental ed. placements
  – Whether they have completed developmental ed.

22

Term data

• Data to be reported each term for each student

• Who should be included
  – All students enrolled in a targeted program of study
  – Includes students who began prior to Fall 2012 who are still enrolled

• What we need to know
  – Identifying variables
  – Whether taking Technical Math & the number of modules they have to take
  – Whether they changed program of study & what the new program of study is
  – Whether they completed developmental ed. requirements

23

Program-level data

• What programs should be included
  – Each targeted program included in PACE
  – A separate row for each program

• What we need to know
  – Identifying variables
  – Credit hours before & after redesign
  – 2-year dev. ed. math, reading, & English completion rates for cohorts from Fall 2008, Fall 2009, & Fall 2010 (see sketch below)
  – 2-year college-level math, reading, & English completion rates for cohorts from Fall 2008, Fall 2009, & Fall 2010
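As a sketch of what a "2-year completion rate" means operationally, using hypothetical file and variable names (actual data come from ADHE and the college worksheets):

```python
import pandas as pd

# Hypothetical cohort file with one row per Fall 2008 entrant.
cohort = pd.read_csv("fall_2008_cohort.csv",
                     parse_dates=["dev_math_completed"])

# 2-year dev. ed. math completion rate: among Fall 2008 entrants with
# a dev. math requirement, the share who completed it within two years
# (here, by the end of Summer 2010). Missing completion dates count
# as not completed.
required = cohort[cohort["dev_math_required"] == 1]
within_2yr = required["dev_math_completed"] <= pd.Timestamp("2010-08-31")
print(f"2-year dev. ed. math completion rate: {within_2yr.mean():.1%}")
```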

24

Developmental education worksheet

• One for each targeted program of study

• Need to know course numbers for
  – Redesigned dev. ed. classes
  – Technical math
  – Past courses that students would have taken in place of these courses

• Will be used to gather data on student progress through developmental courses

25

How is your college using technical math?

• Will you have a modular technical math course this Fall?

• Is it replacing only developmental math?
• Is it replacing only college-level math?
• Is it replacing both?
• If it is replacing both, will some students have to do remediation prior to Technical Math?
• Do your programs have additional math requirements on top of Technical Math?

Questions on targeted program participant & program data?

27

Spreadsheets

• 1st sheet has list of variables, their definitions, & required format

• Other sheets are data table shells to be completed by colleges
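A small sketch of how a submitted workbook could be checked against the definitions sheet. Sheet and column names here are illustrative, not the actual template's:

```python
import pandas as pd

# Sheet 1 lists variables, definitions, and required formats; later
# sheets are the data shells colleges fill in. Names are hypothetical.
definitions = pd.read_excel("pace_data.xlsx", sheet_name=0)
data = pd.read_excel("pace_data.xlsx", sheet_name="TermData")

# Flag any expected variable missing from the completed data sheet.
missing = set(definitions["Variable"]) - set(data.columns)
print("Missing columns:", sorted(missing) if missing else "none")
```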

28

Submission

• Submissions will contain confidential data

• Each college will be given a password & will use the password protection built into Excel

• Submission via secure Drop Box

• Timing of submissions
  – Fall semester data – January 21
  – Spring semester data – June 21
  – Summer semester data – September 21
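On the receiving end, a password-protected workbook can be opened programmatically. One possible sketch using the msoffcrypto-tool package; the file name and password handling are illustrative only:

```python
import io

import msoffcrypto  # pip install msoffcrypto-tool
import pandas as pd

# Decrypt a college's password-protected submission in memory, then
# load every sheet with pandas.
decrypted = io.BytesIO()
with open("college_submission.xlsx", "rb") as f:
    office_file = msoffcrypto.OfficeFile(f)
    office_file.load_key(password="password-issued-to-college")
    office_file.decrypt(decrypted)

decrypted.seek(0)
sheets = pd.read_excel(decrypted, sheet_name=None)  # dict of DataFrames
```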

29

Wrap up data plan

• Only asking colleges for information we cannot get from other sources

• Your help is crucial because changes to dev. ed. are a large part of the PACE initiative

• Especially true of historic dev. ed. completion data & PLA data

PACE Implementation Assessment

31

Why do an Implementation Evaluation?

• Tell the story behind the data
• Contribute to continuous improvement
• Share learning across locations
• Stay on track with goals and funding requirements
• USDOL requirement

32

Overall Objectives

• Ultimate objective: capture lessons and best practices from your experiences that contribute to your ongoing efforts and the field in general
  1. Understand how you plan to implement the strategies
  2. Track early outcomes (findings and challenges) from initial implementation
  3. Describe and share adaptations made in response to these early outcomes
  4. Document lessons learned from modifications and final outcomes

33

Our Approach to Evaluating Implementation

• Greater focus on qualitative information
• Evaluation plan must be fluid and responsive
• Each phase builds on the prior phase
  – Start-up and end of grant period usually see the heaviest information-gathering push
• Timing is frequently subject to course corrections

34

Key Topics of Inquiry

• For each of the three strategies outlined to USDOL: how has the strategy been implemented, and how have students utilized/experienced it?
  – Describe key redesign features and approaches used in implementing them, for example:
    • Personnel changes/additions
    • Professional development and peer learning activities
    • Specific models employed (e.g., CAEL, El Paso PREP)
    • Curricula and/or delivery innovations
    • New uses of technology
    • Involvement of external partners (employers, WIBs, etc.)
    • New roles for staff or faculty

35

Methods of Evaluating Implementation

• Document review
  – Relevant institutional policies
  – Curricula materials
  – Scheduling information
  – Informational/outreach materials

• Surveys

• Interviews
  – Phone and/or in person

• On-site observation

• Focus groups
  – On site

36

Implementation Evaluation Information Gathering Timeline

• Fall 2012 semester
  – Analyze information from initial surveys (due Aug. 3)
  – Document review

• Winter 2013 semester
  – Second round of surveys on planning progress (first half of semester) and early lessons/challenges
  – Site visits (end of semester)

• Academic year 2013-14
  – Surveys to track implementation progress and adaptations
  – Phone interviews or other follow-up if needed

• Fall 2014 semester
  – Final document review
  – Final surveys and close-out site visit

37

Before we go to lunch…

• Any questions about the implementation evaluation approach?

• Lunch discussion topics:
  – Reflect on data plan
  – What are the key student success priorities at your institution?
  – What would be most useful (for your institution) to learn during and after PACE implementation?

38

Next Steps

• Updates based on today's discussion
• Questions and clarifications
• Cohort definitions
• Rolling out analyses during the semester

39

Contact Information

• Project Manager: Chris Spence, 216.767.6262, cspence@newgrowthplanners.com

• Impact Assessment: Joel Elvery, PhD, 216.375.6777, jelvery@newgrowthplanners.com

• Implementation Assessment: Holly Parker, 734.769.2900, hparker@skilledwork.org

We look forward to working with you!