Page 1: Assessment as a foundational structure for democratic culture

[Figure: diagram of the axes of tensions experienced by enacting democratic values in the world; dialectical tensions negotiated through assessment.]

Axes of tensions (we nudge ourselves toward a better world):

● Share authority <-> Concentrate authority
● Challenge norms <-> Work within established norms
● Be efficient <-> Be efficacious
● Be pragmatic in the world <-> Realize ideals in the world

Values negotiated: within the self; in relationship with diverse others; within/across institutions and systems

Assessment as/for civic agency, efficacy, solidarity, learning, deliberative decision-making, and healing

Democratically-Engaged Assessment invites us to situate assessment as one of what H. Giroux (2013) calls "laboratories for the expression of the civic imagination."

Reclaiming assessment as an act of resistance

Last update 1/22/2018

Page 2

[Imagining America’s Assessing the Practices of Public Scholarship (APPS) working group, including co-authors Joe Bandy, Anna Bartel, Sylvia Gale ([email protected]), Georgia Nigro, Mary F. Price ([email protected]), & Sarah Stanlick, with Patti H. Clayton]

Democratically Engaged Assessment: Re-Imagining the Purposes and Practices of Assessment in Community Engagement

Starting points:

● Assessment is always undergirded by values.
● It is most important to ask "which values?" and "who determines them?"
● It matters whether we default to a set of values, let ourselves be pressured into alignment with others' values, or deliberately choose the values around which we build assessment (and other) practices.

Grounding framework: Democratic Engagement. (In many ways we live with one foot in each of these paradigms, which intermingle and push/pull against each other in varied subtle and overt ways throughout many domains of life.)

Technocratic Engagement | Democratic Engagement
For | With
Deficit-based | Asset-based
Uni-directional flow of knowledge from credentialed academic experts | Multi-directional flow of ideas and questions within a web of knowledge and practice centers
Often transactional exchanges | Potentially transformative (of self, others, organizations/institutions, systems, paradigms) partnerships
Hierarchical power dynamics | Power-shifted dynamics that disrupt hierarchy and position all partners as co-educators, co-learners, and co-creators of knowledge and practice

[excerpted and modified by Clayton from Saltmarsh, Hartley, & Clayton, 2009]

Central question: How might we better walk the talk of the values of democratic engagement in our assessment work, navigating constraints and inevitable tension points while empowering all participants?

Democratically Engaged Assessment (DEA): assessment that is explicitly grounded in, informed by, and in dialogue with the (contested) values of democratic engagement.

Why democratically engaged assessment?

● Borrowing from Giroux (2013), we might say that engaging [democratic] values in assessment is an act of “civic imagination,” an act that requires us to reimagine assessment as a cultural practice through which we can take “seriously the demands of justice, equity, and civic courage” (pp. 16-17) and thereby begin to transform our organizations, our communities, and ourselves. If we understand DEA as an imaginative act, then we also recognize it to be always also emerging, developing, re-forming. We do not master it so much as we practice it, along the way developing the skills and knowledge that are “central to democratic forms of education, engagement, and agency” (p. 16).

Page 3

Core values of democratically engaged assessment (always and already subject to negotiation, interrogation, and critique in the times, places, and social contexts of their application):

● Full Participation -- Full participation refers to the creation of assessment processes that "enable people, whatever their identity, background, or institutional position, to thrive, realize their capabilities, engage meaningfully" in assessment, "and contribute to the flourishing of others" (Sturm 2006, 2010, in Sturm, Eatman, Saltmarsh, & Bush, 2011, p. 3). This requires critical attention to, and work to cultivate, the conditions and practices (organizational, institutional, economic, political, cultural, physical, communicative, etc.) of assessment that enable all participants and their perspectives to be included, respected, valued, and supported.

● Co-creation -- Co-creative relationships among participants in assessment embrace the goals of mutually beneficial outcomes, to be sure, but, more transformationally, also strive to develop highly collaborative approaches to gathering and producing knowledge and to developing and refining practice. Here, the ideals and processes of intentional power-sharing, the development of trust, the generous use of participant assets and capabilities, and the deliberative negotiation of difference are crucial in defining processes and products.

● Generativity -- Generative processes (including those of assessment) open up rather than close off possibilities for meaning making and growth and embrace a sense of wonder about the world and its transformation. Generativity focuses on the creation of useful knowledge that can feed dynamic and ongoing processes of inquiry and discovery for courses, programs, institutions, communities, and society as a whole.

● Rigor (intellectual and ethical) -- Rigor involves a critical orientation toward work done in partnership, in which many stakeholders come together to listen, question, evaluate, and inform. Implicit in the word rigor is a degree of disciplining or adherence to a process of inquiry that leads those stakeholders to trust and recognize as authentic a process or piece of evidence. Beyond critical thinking about methods, rigor as a value prompts us to question objectivity and assumptions and to weigh the wisdom of "what is" against the opportunities of "what might be." Rigor is not taken to be an inflexible or rigid constraint that norms or standardizes inquiry; rather, it encourages critical examination of our work to ensure that commitments and intentions are being realized with integrity.

● Practicability -- Practicable assessment is grounded in the realities of the world as it is. That is, for assessment to be effective it must be feasible, and thus it must be mindful of the constraints and opportunities inherent within available human or fiscal resources, organizational imperatives, and social systems. This does not suggest that assessment or the community engagement projects it serves should not challenge and seek to overcome limitations, but that it must be mindful of the conditions that shape it and intentional in navigating them.

● Resilience -- Resilient partnerships and projects are durable over time and have the longevity necessary for transformative impacts as well as the assessment work that helps make them possible. This longevity requires that participants nurture their relationships so that they can both withstand the inevitable pressures and disruptions of social forces large and small and rebound from adversity in ways that promote growth and just change in their organizations, communities, and society.

Page 4


Table 6: DEA Questions to Pose in Examining an Assessment Tool through the Lens of DEA

DEA core value Questions to pose in examining an assessment tool through the lens of DEA

Full Participation

● Who was and was not involved in developing it? Who decided who was involved? Were the perspectives of all stakeholders taken into account in deciding?

● Who is and is not involved in selecting it? Who decides who is involved? Are the perspectives of all stakeholders taken into account in deciding?

● Who is and is not involved in implementing it? Who decides who is involved? Are the perspectives of all stakeholders taken into account in deciding?

● Who is and is not involved in making meaning of the information it generates? Who decides who is involved? Are the perspectives of all stakeholders taken into account in deciding?

● Who is and is not involved in reporting and sharing the process and the results? Who decides who is involved? Are the perspectives of all stakeholders taken into account in deciding?

● Who is and is not involved in using what is learned? Who decides who is involved? Are the perspectives of all stakeholders taken into account in deciding?

● Who has ready access to it? In other words, who knows it exists or is able to find it easily?
● Are the data it collects relevant to all participants?
● Is the format of the tool such that all relevant collaborators and stakeholders can make use of it? What accommodations or supports will need to be put in place to enable all parties to use the tool?
● Is the language of the tool familiar to all participants? If not, how might we minimize jargon and otherwise unfamiliar terms?
● To what extent is the tool representative of participant perspectives?
● Will the tool generate data that is understandable and applicable by all?

Co-creation

Questions above for "Full Participation" plus:

● Who contributed to the knowledge base it was developed from (and who did not)?
● Do the users feel empowered to change it? Is there an open and transparent process through which changes are proposed, discussed, decided upon, and implemented?
● How are participants working together -- communicating, deliberating, raising questions and concerns, etc. -- throughout development, selection, implementation, meaning making, reporting, and using results?
● Are we using the tool in ways that intend mutual benefits for all partners, or, if not, that intend benefits as all have agreed upon?
● Does the tool exhibit reciprocity (i.e., "recognition, respect, and valuing of the knowledge, perspective, and resources that each partner contributes to the collaboration" (UNCG))?

Generativity

● Does the tool help to identify strengths and assets (e.g., in project or program development, in partners’ understanding and growth, in partnership processes)?
● Does the tool help users see what processes and outcomes are constructive of future projects and partnerships?
● Why is the tool being used? Toward what ends?

Page 5


● Is the tool being used in a way that will open up new possibilities for practice and inquiry?
● Is the tool being used in ways that build up individuals and organizations/institutions (e.g., their confidence, their capacities, their commitments) or in ways that tear them down or diminish them?
● Do the tool and how it is being used promote critical reflection on ourselves, our work together, our goals, etc.? Does it help us see and question our assumptions, goals, etc.? Does it support critique that in turn supports growth and development of people, organizations, and processes?
● Does what we can learn from it transcend the immediate question and point to broader meanings and possibilities?
● Are we using the tool and what we learn from it to reinforce the status quo or to invite shifts in perspective, practice, and identity?

Rigor (intellectual and ethical)

● Are all the sources that informed its development acknowledged?
● Whose norms/standards are being used to establish whether the tool is sufficiently rigorous?
● Who owns the version of the tool that was used, if it was modified in the process? The findings the tool generated? Reports and other documentation materials? Any recommendations or new understandings produced?
● Does the tool enable a thorough investigation into relevant questions and information?
● Is the tool valid? Who decides criteria for validity?
● Does the tool generate information that is useful? Reliable? Trustworthy? Relevant?
● Does the tool allow us to see change over time?
● Are we making meaning of the information the tool helps us gather in ways that are appropriate for our goals, questions, and context?
● Does anyone have to compromise their integrity in order to participate in use of the tool?
● Does the tool have clearly defined terminology? Does the language of the tool further enshrine problematic assumptions or interpretations?
● Is the tool grounded in conceptual clarity?
● Has the tool been tested (e.g., for usability, for accessibility, for analyze-ability of the information it generates)?
● Does the tool allow information to be gathered in more than one way? If not, what other tools might we use to supplement it?
● Are we using the tool in ways that are respectful of all participants in the process?
● Is anyone harmed by our use of the tool?
● Is there any actual or perceived risk to any participants in using it? Are potential risks made visible so that potential participants may make an informed choice about participating?
● Do the tool and our use of it adhere to the review standards of all participating organizations/institutions?
● Does it enable us to ask about who does and does not benefit by our processes?

Practicability

● What trade-offs are involved in using it? Who has the authority to decide which cost or benefit trade-offs will be made, and who has responsibility for the consequences of the trade-offs?
● Does the tool accommodate the constraints of the context in which it will be used? Does it challenge the constraints in a productive, strategic way?
● What resources are required for its implementation, and do these conform to the assets of the participants? If not, can we modify how we implement it, or can we bring in additional assets? How might resources be leveraged to enable us to use this tool?
● Are the social conditions (organizational, cultural, political, economic) suitable for the use of the tool? If not, how may the tool be modified or implemented appropriately? Or, how might social conditions be altered to support the use of the tool?
● What benefits does use of the tool offer? Are these greater than the costs? How can we be expansive in defining all potential benefits and costs, so that we can properly assess the tool’s potential?
● What decision principles other than cost-benefit analysis is it important for us to bring to bear in determining practicability?

Resilience

● Can the tool be used repeatedly over time?
● Can the tool be used in longitudinal models of inquiry (e.g., into project or program development, into partners’ understanding and growth, into partnership processes)?
● Will the tool generate data that is useful for long-term planning?
● Can the tool provide insights that lead to adaptation and transformation (of ourselves, our work, our society)?
● Will the tool help to overcome central challenges to student learning, faculty development, institutional growth, partnerships, or community or social change?
● Will the tool help identify assets that can support participants’ development for long-term transformation?

Page 7

Resources for Assessment in Community Engagement [APPENDIX in Democratically Engaged Assessment: Re-Imagining the Purposes and Practices of Assessment in Community Engagement]

Focus of Assessment | Key References | Methodological Tools

Student intellectual development (content knowledge, critical thinking, problem solving)

Key references: Ash, Clayton, & Atkinson, 2005; Astin & Sax, 1998; Astin, Vogelgesang, Ikeda, & Yee, 2000; Eyler & Giles, 1999; Eyler, Giles, Stenson, & Gray, 2001; Fitch, Steinke, & Hudson, 2013; Jameson, Clayton, & Ash, 2013

Methodological tools: Reflective Judgment Model (King & Kitchener, 2002); The Measure of Service Learning: Research Scales to Assess Student Experiences (Bringle, Phillips, & Hudson, 2004); Critical Reflection (Ash & Clayton, 2009); VALUE Rubrics (e.g., Problem-Solving, Intercultural Knowledge and Competence, Critical Thinking) (AAC&U, 2009); Problem-Solving Analysis Protocol (P-SAP) (Fitch, Steinke, & Hudson, 2013)

Student personal development (e.g., moral, emotional, social, efficacy, empathy, identity development, motivation, agency, career)

Key references: Astin, Vogelgesang, Ikeda, & Yee, 2000; Astin & Sax, 1998; Brandenberger, 2013; Eyler & Giles, 1999; Eyler, Giles, Stenson, & Gray, 2001; Felten, Gilchrist, & Darby, 2006; Jones & Abes, 2004; Kahne & Westheimer, 2006; Lundy, 2007

Methodological tools: Community Service Self-Efficacy Scale (Reeb, Katsuyama, Sammon, & Yoder, 1998); Faith and Civic Engagement (FACE) Scale (Droege & Ferrari, 2012); Revised Empathic Anger Scale (Bringle, Hedgepath, & Stephens, 2015)

Student civic outcomes (including civic responsibility, intercultural competence, social justice orientation)

Key references: Battistoni, 2013; Deardorff & Edwards, 2013; Eyler, 2010; Eyler & Giles, 1999

Methodological tools: Civic Responsibility Survey (Furco, Muller, & Ammon, 1998); Civic Attitudes and Skills Questionnaire (CASQ) (e.g., Moely, Miron, Mercer, & Ilustre, 2002); Civic Engagement VALUE Rubric (AAC&U, 2009)

Page 8

Focus of Assessment | Key References | Methodological Tools

Key references (continued): Hatcher, Bringle, & Hahn, 2017; Kiely, 2005; Pascarella & Terenzini, 2005; Saltmarsh, 2005; Steinberg, Hatcher, & Bringle, 2011; Stokamer, 2011; Yorio & Ye, 2012

Methodological tools (continued): Civic-Minded Graduate Scale, quantitative and qualitative (Steinberg, Hatcher, & Bringle, 2011); Social Justice Scale (SJS) (Torres-Harding, Siers, & Olson, 2012); Civic Engagement Scale (Doolittle, 2013); Awareness of Privilege and Oppression Scale II (McClellan, 2014)

Faculty (learning, motivation, agency, development, teaching, scholarship)

Key references: Blanchard, Hanssmann, Strauss, Belliard, Krichbaum, Waters, & Seifer, 2009; Blanchard, Strauss, & Webb, 2012; Clayton, Hess, Jaeger, Jameson, & McGuire, 2013; Chism, Palmer, & Price, 2013; Cruz, Ellern, Ford, Moss, & White, 2012; Doberneck & Fitzgerald, 2008; Foster, 2012; O’Meara, 2013; Seifer, Blanchard, Jordan, Gelmon, & McGinley, 2012

Methodological tools: Scholarship Reconsidered (Boyer, 1990); Points of Distinction: A Guide for Planning and Evaluating Quality Outreach (Michigan State University, 1996, 2000); Scholarship Unbound (O’Meara, 2001); Community-Engaged Scholarship Review, Promotion, & Tenure Package (Jordan, 2007); Scholarship in Public (Ellison & Eatman, 2008); Service-Learning Quality Assessment Tool (SLQAT) (Furco et al., 2017); Holistic Framework for Educational Professional Development (Welch & Plaxton-Moore, in press)

Institutions of higher education

Key references: Franz, Childers, & Sanderlin, 2012; Furco & Holland, 2013; Furco & Miller, 2009; Kecskes, 2013; Moore & Ward, 2010; Saltmarsh & Gelmon, 2006

Methodological tools: Carnegie Community Engagement Classification; Self-Assessment Rubric for the Institutionalization of Service Learning in Higher Education (Furco, 1999); The Engaged Department Toolkit (Battistoni, Gelmon, Saltmarsh, Wergin, & Zlotkowski, 2003); Building Capacity for Community Engagement: Institutional Self-Assessment (Gelmon, Seifer, Kauper-Brown, & Mikkelsen, 2005)

Page 9

Focus of Assessment | Key References | Methodological Tools

Key references (continued): Saltmarsh, Clayton, & Janke, 2016; Saltmarsh, Hartley, & Clayton, 2009; Sandmann & Plater, 2009; Sandmann & Plater, 2013; Warnick, 2007

Methodological tools (continued): Creating Community-Engaged Departments: Self-Assessment Rubric for the Institutionalization of Community Engagement in Academic Departments (Kecskes, 2008); Rubric for the Community-Engaged School/College (Janke et al., 2017)

Partnerships

Key references: Bringle & Clayton, 2013; Burns, 1978; Community-Campus Partnerships for Health, 2007; Creighton, 2006; Cunningham, 2008; d’Arlach, Sanchez, & Feuer, 2009; Driscoll & Kecskes, 2009; Dumlao & Janke, 2012; Holland, 2004; Jacoby, 2013; Janke, 2013; Koch, 2005; Miron & Moely, 2006; Mondloch, 2009; Sandy & Holland, 2006; Stoecker & Tryon, 2009; Ward & Wolf-Wendel, 2000; Weiss, Anderson, & Lasker, 2002; Wolff, Greene, & White, 2012

Methodological tools: Building Partnerships with College Campuses: Community Perspectives (Leiderman, Furco, Zapf, & Goss, 2002); CHESP Assessment Framework (Gelmon, 2003); Community Impacts of Research Oriented Partnerships (CIROP) (King, Currie, Smith, Servais, & McDougall, 2008); Students, Organizations, Faculty, Administrators, Residents (SOFAR) (Bringle, Clayton, & Price, 2009); Partnership Assessment Toolkit (Afsana, Habte, Hatfield, Murphy, & Neufeld, 2009); Democratic/Technocratic Paradigms of Engagement (Dostilio & Clayton, 2011; Saltmarsh, Hartley, & Clayton, 2009); TRES (Clayton, Bringle, Senor, Huq, & Morrison, 2010); Principles of Partnership (CCPH Board of Directors, Position Statement on Authentic Partnerships, Community-Campus Partnerships for Health, 2013); Community Engagement Partnership Rubric (Mack, Brotzman, & Deegan, 2014); Service-Learning Quality Assessment Tool (SLQAT) (Furco et al., 2017); Collaborative Relationship Mapping [ColRM] (Price, 2017); Living Knowledge’s science shops toolkit

Page 10

Focus of Assessment | Key References | Methodological Tools

Community Outcomes

Key references: Cruz & Giles, 2000; Gelmon, Holland, Driscoll, Spring, & Kerrigan, 2001; Guijt, 2007; Hart, Northmore, & Gerhardt, 2009; Marullo et al., 2003; Olney, Livingston, Fisch, & Talamantes, 2006; Reeb & Folger, 2013; Reeler, 2007; Schmidt & Robby, 2002; Worrall, 2007

Methodological tools: Community Capitals Framework (Emery & Flora, 2006; Mattos, 2015); Typology of Community Outcomes (Gemmel & Clayton, 2009); Whole Measures (Center for Whole Communities); Continuum of Impact (Animating Democracy); Collective Impact Framework (Kania & Kramer, 2011); Ripple Effect Mapping (Kollock, Flage, Chazdon, Paine, & Higgins, 2012); Outcomes Harvesting (Wilson-Grau, 2015); Fidelity criteria monitoring and sustainability checks (Reeb et al., forthcoming)

