
Campus Compact March 2016 Conference

TEAGLE CONSORTIUM OF LIBERAL ARTS COLLEGES AND THE DEVELOPMENT OF THE CBL SCORECARD. Dr. Margueritte Murphy, Faculty Advisor and Liaison, Center for Community Engagement & Service-Learning; Kathleen S. Flowers, Director, Center for Community Engagement & Service-Learning. Tuesday, March 22, 2016, Campus Compact 30th Anniversary Conference.
Transcript
Page 1

TEAGLE CONSORTIUM OF LIBERAL ARTS COLLEGES AND THE DEVELOPMENT OF THE CBL SCORECARD

Dr. Margueritte Murphy, Faculty Advisor and Liaison, Center for Community Engagement & Service-Learning
Kathleen S. Flowers, Director, Center for Community Engagement & Service-Learning

Tuesday, March 22, 2016
Campus Compact 30th Anniversary Conference

Page 2

Agenda:

1. Introductions (Facilitators and Attendees)
2. Story of the Teagle Consortium of Liberal Arts Colleges and development of the Community-Based Learning (CBL) Scorecard
   • Student scorecard
   • Community partner scorecard
   • Faculty scorecard
3. Implementation (what worked, and what didn’t!)
4. Q and A
5. Small group discussions
   • What does CBL assessment look like on your campus?
   • If and how the CBL scorecards might assist in your efforts

Page 3

Aim: “to build a replicable process for assessing community-based learning courses and programs”

• Systematically assess the value added of CBL programming on student learning and civic engagement, using the CBL Scorecard we developed for measuring CBL course/program effectiveness;

• Close the assessment loop by developing a process for applying Scorecard results to course/program improvement and by broadly disseminating and encouraging the use of the protocol and collected data institutionally, regionally, and nationally; and

• Expand and sustain a consortium of liberal arts colleges committed to establishing and sharing effective practices for the assessment of community-based learning. Two sub-goals for the consortium are to:
   • Disseminate information about the impact of CBL on student cognitive learning, and
   • Create a culture of assessment on the campuses of participating institutions.

Page 4

The two primary elements of our plan for achieving these goals are:

(1) a fully developed, tested, and replicable assessment protocol that can be broadly disseminated (both the instruments and the data collected); and

(2) the creation of a community of practice for liberal arts colleges to implement the protocol on their campuses and share data and best practices about community-based learning. (From Report to the Teagle Foundation, 2008)

The CBL Scorecard is grounded in existing research on practices that promote learning in community-based learning courses and programs.

Page 5

Refinement of instrument: Vanderbilt University consultants Drs. John Braxton and Willis Jones (now at the University of Kentucky) undertook an extensive literature review on community-based learning and the development of best practices for CBL programs in higher education. They grouped these best practices into four overarching “domains of practice”: (1) placement quality, (2) application and connection to academic learning, (3) reflection, and (4) quality of community partnerships. Braxton and Jones sent the factors to national service-learning experts and, applying their weightings, further refined the instrument.

Page 6

The consortium administered the CBL Scorecard across member institutions and multiple disciplines over several years.

We believe that our approach is valuable in both its focus on the course or program as the unit of analysis and its foundation in existing research on effective service learning practice.

Summary Stats per Domain: Students only (Fall 2010 / 148 students / 14 classes @ 5 institutions)

Domain      Observations   Mean       SD         Min      Max
Applica     114            67.10658   11.52938   32.15    84.8
ComPart      99            63.00253    9.33377   38.5     77
PlaceQual   114            80.41886   10.33977   55       96
Reflect     107            70.8972    10.23941   44.75    90
Total        90            281.7683   35.85141   197.15   347.8
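Rows like those in the summary table above (observations, mean, sample SD, min, max) can be produced from per-student domain totals. A minimal Python sketch; the scores below are made up for illustration, and `summarize` is a hypothetical helper, not part of the Scorecard itself:

```python
import statistics

def summarize(scores: list[float]) -> dict:
    """Summary row for one domain: count, mean, sample SD, min, max."""
    return {
        "obs": len(scores),
        "mean": statistics.mean(scores),
        "sd": statistics.stdev(scores),  # sample standard deviation
        "min": min(scores),
        "max": max(scores),
    }

# Made-up per-student totals for one domain (illustration only)
reflect_scores = [70.0, 65.5, 82.25, 58.0]
row = summarize(reflect_scores)
print(row["obs"], row["min"], row["max"])  # 4 58.0 82.25
```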

Rating              Placement Quality   Application & Connection   Reflection   Community Partners
strongly disagree    3                   3.54                       2.5          3.85
disagree             6                   7.08                       5            7.70
agree                9                  10.62                       7.5         11.55
strongly agree      12                  14.16                      10           15.40

Summary Stats per Institution, HWS Example: 85 students – 4 classes
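The anchor values above are consistent with each domain carrying a single weight multiplied by a 1–4 Likert rating (e.g., Community Partners: 3.85 × 4 = 15.40). A minimal Python sketch of that mapping; the weights are inferred from the anchors, and `weighted_score` is a hypothetical illustration, not the consortium's code:

```python
# Domain weights inferred from the anchor table (hypothetical reconstruction)
DOMAIN_WEIGHTS = {
    "Placement Quality": 3.0,
    "Application & Connection": 3.54,
    "Reflection": 2.5,
    "Community Partners": 3.85,
}

def weighted_score(domain: str, rating: int) -> float:
    """Map a 1-4 Likert rating (1=strongly disagree ... 4=strongly agree)
    to the weighted anchor value for the given domain."""
    if rating not in (1, 2, 3, 4):
        raise ValueError("rating must be an integer from 1 to 4")
    return round(DOMAIN_WEIGHTS[domain] * rating, 2)

print(weighted_score("Community Partners", 4))  # 15.4
```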

Page 7

“CBL Scorecard 2.0”

Analysis evolved with input from consortium members.

Page 8

Community of Practice… an indirect but tangible and positive outcome!

Page 9

Dr. Susan Dicklitch, Franklin & Marshall College. Gov. 425 - Human Rights + Human Wrongs

Campus Conversation: Survey submissions and analysis; “Self-assessment Instrument for Service-Learning Sustainability”

Page 10
Page 11
Page 12
Page 13

Our goal was to make the instrument useful as a classroom diagnostic tool that can be readily used by instructors without professional interpretation.

• Student scorecard
• Community partner scorecard
• Faculty scorecard

Page 14

“I mean it’s been extremely beneficial, absolutely. It’s helpful on the front end about how you think about shaping a course to meet those criteria. So knowing that there’s going to be this assessment stuff, it really does help shape the way you’re thinking about a course, rather than, ‘Oh, my gosh. I gotta get this together,’ and it’s a narrow sense. You think more broadly about course design because you have a scorecard as a framework to operate out of.”

“. . .it helped me realize what I wasn’t doing that I needed to do. I kind of knew what it was. It’s just in order to do it you really need to invest a lot more time than what I had put in. It was helpful in that way to clarify for me knowing what the best practices were in the literature and in-the-field that I wasn’t hitting all those best practices.”

“But how it (the scorecard) had functioned for myself and other faculty here, who were teaching, it helped you think more comprehensively about what we wanted for our course - how we wanted our course. So that intentionality really helped.”

What did the faculty say about it?

Page 15

What did our community partners say about it?

Community partners preferred to provide feedback based on their direct observations of students (rather than via the Teagle Scorecard)

Page 16

Thank you! Feel free to contact us with questions or suggestions!

[email protected] or [email protected]

Small Group Discussion

1. How might the scorecard assist your campus CBL assessment efforts?
2. Will it build upon existing successful strategies? If so, how?
3. Please share resources you’ve found helpful.
4. What challenges do you anticipate facing if you choose to implement the scorecard, and can we collectively brainstorm a solution?

Page 17

APPENDIX II: SOURCES CITED – DEVELOPMENT OF SCORECARD

Alexander W. Astin, Lori J. Vogelgesang, Elaine K. Ikeda, and Jennifer A. Yee, How Service Learning Affects Students (Los Angeles: Higher Education Research Institute, University of California, 2000). Executive Summary: http://www.gseis.ucla.edu/heri/PDFs/rhowas.pdf

Alexander W. Astin and Linda J. Sax, “How Undergraduates are Affected by Service Participation,” Journal of College Student Development 39:3 (1998): 251-263.

Alexander W. Astin, Linda J. Sax, and J. Avalos, “Long Term Effects of Volunteerism During the Undergraduate Years,” Review of Higher Education 22:2 (1999): 187-202.

G. M. Bodner, T. L. B. McMillen, “Cognitive Restructuring as an Early Stage in Problem Solving,” Journal of Research in Science Teaching 23:8 (Nov 1986), 727-37.

L. Carter, Using Situational Leadership to Reach the Whole Population, paper presented at the Association for General and Liberal Studies Conference (Daytona Beach: October 1996).

Campus Compact, Introduction to Service-Learning Toolkit: Readings and Resources for Faculty, Second Edition (Providence: Brown University, 2003), 7-10.

S. L. Enos & M. L. Troppe, “Service-Learning in the Curriculum,” Service-Learning in Higher Education: Concepts and Practices (San Francisco: Jossey-Bass Publishers Inc., 1996), 156-181.

Janet Eyler and Dwight E. Giles, Jr., Where’s the Service in Service-Learning? (San Francisco: Jossey-Bass, 1999).

Janet S. Eyler, Dwight E. Giles, Jr., Christine M. Stenson, and Charlene J. Gray, At A Glance: What We Know about The Effects of Service-Learning on College Students, Faculty, Institutions and Communities, 1993-2000: Third Edition (Corporation for National Service, Learn and Serve America National Service-Learning Clearinghouse: 2001). http://www.servicelearning.org/filemanager/download/4192_AtAGlance.pdf

V. J. Fabry, R. Eisenbach, R. R. Curry, & V. L. Golich, “Thank you for asking: Classroom assessment techniques and students’ perceptions of learning,” Journal on Excellence in College Teaching, 8:1 (1997), 3-21.

Andrew Furco, “Self-Assessment Rubric for the Institutionalization of Service-Learning In Higher Education,” Service-Learning Research & Development Center University of California, Berkeley (revised 2002). http://servicelearning.org/filemanager/download/Furco_rubric.pdf

J. A. Galura & J. P. F. Howard, Praxis I: A Faculty Casebook on Community Service Learning (Ann Arbor: The OCSL Press, 1996), 5-9.

G. Gibbs & M. Coffey, “The Impact of Training of University Teachers on their Teaching Skills, their Approach to Teaching and the Approach to Learning of their Students,” Active Learning in Higher Education: The Journal of the Institute for Learning and Teaching, 5:1 (Mar 2004), 87-100.

Page 18

Greene and G. Diehm, “Educational and Service Outcomes of a Service Integration Effort,” Michigan Journal of Community Service Learning, 2 (1995): 54-62.

E.P. Honnet and S.J. Poulsen, Principles of Good Practice for Combining Service and Learning: A Wingspread Special Report (Racine, WI: The Johnson Foundation, Inc., 1989). http://www.servicelearning.org/resources/online_documents/service-learning_standards/

L. W. Keig & M. C. Waggoner, “Peer review of teaching: Improving college instruction through formative assessment,” Journal on Excellence in College Teaching, 6:3 (1995), 51-83.

D. A. Kolb, Experiential Learning: Experience as the Source of Learning and Development (Upper Saddle River: Prentice Hall Inc., 1984).

A. E. Lawson, D. L. Banks, M. Logvin, “Self-Efficacy, Reasoning Ability, and Achievement in College Biology,” Journal of Research in Science Teaching, 44:5 (2003), 706-24.

Barbara E. Moely, Sterett H. Mercer, Vincent Ilustre and Devi Miron, “Psychometric Properties and Correlates of the Civic Attitudes and Skills Questionnaire (CASQ): A Measure of Students’ Attitudes Related to Service-Learning,” Michigan Journal of Community Service Learning, 8:2 (2002): 15-26.

R. Nasser, J. Carifio, The Effects of Cognitive Style and Piagetian Logical Reasoning on Solving a Propositional Relation Algebra Word Problem, paper presented at the New England Educational Research Organization (Portsmouth NH: April 1993).

S. A. Nunes & T. F. Halloran, Elements for a Training Design: An Assessment of Competencies for Effective ABE Instructors, Florida State Department of Education, Tallahassee (1987).

H. Seeman, “A Rationale and Methodology for Assessing and Awarding Credit for Student’s [sic] Experience in Cooperative Education,” Workplace Education 2:2 (Nov-Dec 1983), 11-12.

N. S. Shapiro, “Rhetorical Maturity and Perry’s Model of Intellectual Development: A Study of College Students’ Writing and Thinking,” Research Report (Educational Resources Information Center: 1985).

M. K. Snooks, S. E. Neeley, & L. Revere, “Midterm student feedback: Results of a pilot study,” Journal on Excellence in College Teaching, 18:3 (2007), 55-73.

M. V. Subramony, “The Relationship of Performance Feedback and Service-Learning,” Michigan Journal of Community Service Learning 7 (2000): 46-53.

G. Whitcomb, “Academic Service-Learning in Christian Higher Education: A Critical Assessment,” Association of Christians in Student Development National Conference 2007, workshop and paper.

Q. Zhang, “Teacher Misbehaviors as Learning Demotivators in College Classrooms: A Cross-Cultural Investigation in China, Germany, Japan, and the United States,” Communication Education, 56:2 (2007), 209-227.

