Setting And Communicating Assessment Vision And Expectations:
Discussing Barriers And Solutions
Joni E. Spurlin, Ph.D.
University Director of Assessment
NC State University
NC State Undergraduate Assessment Symposium April, 2008
Session Outcomes
The goal of this session is to engage participants in a thoughtful dialogue.
By the end of this session, participants will be able to: define why a vision for assessment is useful; define assessment expectations; and discuss barriers and solutions to barriers related to communicating the vision and expectations of assessment.
What is New or Different about this Session?
Probably not much. From a psychological viewpoint, if there are "problems," we first have to identify the issues, then talk about them and together find solutions.
Use this for your institution, from an angle you might not have considered.
Help synthesize some of the ideas you may have heard over the past two days.
Doing Assessment As If Learning Matters Most by Thomas A. Angelo
“Thus, in order to move beyond piecemeal and superficial change and toward transformation, we need to develop a learning community-like culture among the faculty and administrators involved in assessment. Four basic preconditions are key to this collective personal mastery. First, we need to develop shared trust; second, shared visions and goals; and third, shared language and concepts. Fourth, we need to identify research-based guidelines that can orient our assessment efforts toward the goal of creating productive learning communities.” AAHE May 1999: http://education.gsu.edu/ctl/outcomes/Doing%20Assessment%20As%20If%20Learning%20Matters%20Most.htm
Topics
Vision of the assessment process: barriers, some solutions, examples, exercises
Expectations of the assessment process and of student learning: barriers, some solutions, examples, exercises
Vision is…
A picture of the future
Inspirational
A framework for planning
Dreams and hopes balanced with reality
Feasible, attainable
"If the strategic plan is a 'blueprint' for an organization's work, the vision is the 'artist's rendering' of the achievement of that plan."
…scary …intimidating
Group Discussion
Why develop a "SHARED VISION" of the assessment process within your institution?
Why Develop a "SHARED VISION" of the Assessment Process?
Increases the sense of shared responsibility for student learning
Focuses direction for the next few years
Pulls our sight above the day-to-day aspects of our work
Stretches us beyond our status quo… stretches expectations, aspirations, and performance
Provides an "outcome" against which to plan and assess
Group Discussion
In your institution, what are barriers to developing a VISION?
Barriers
Differences in how assessment is valued.
Lack of understanding, and lack of clear and agreed-upon definitions, of accreditation, assessment, and accountability, including the differences and relationships among them.
Differing attitudes, knowledge, and competing needs for resources.
Expectations of different parties collide: administration, faculty, students, assessment professionals, internal and external needs.
Exercise
Q: How Do We Overcome Barriers?
Worksheet: examine the list of barriers on the previous slide, pick one, and list a way to overcome it.
5 minutes
Overcome Barriers
Defining what you are talking about.
Defining and developing consensus about the purposes of assessment, assessment outcomes, and the use of assessment findings for informing: individual students; academic or student affairs programs; the college or division; the institution; external agencies.
Examples of Vision Statements
Continuous, ongoing assessment that is fully integrated into all units of the university
A campus culture that fully participates in and uses information from assessment initiatives to make decisions
A process that matures and grows - Assessment aligns with what faculty/staff are doing and why they are doing what they are doing
Process defines measurable learning outcomes which help reframe departments’ thinking
Annually celebrate process, success, failure
Other Vision Examples
The handout has examples from other institutions:
American Association for Higher Education
NC State University
University of Wyoming
Millersville University
Towson University
Indiana State University
Example of Definitions
NC State University: Common Language: http://www.ncsu.edu/uap/academic-standards/uapr/process/language.html
James Madison University: Dictionary of Student Outcome Assessment: http://people.jmu.edu/yangsx/
Western Kentucky University: http://www.wku.edu/sacs/assessmentmanual.htm
"Glossaries" are available on the Internet Resources list: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
Adding to our vocabulary: transparency; evidence of student learning
Exercises for your institution
See “Exercises” in the handout
CHEA’s View on Expectations:
“Institutions and programs are responsible for establishing clear statements of student learning outcomes and for collecting, interpreting, and using evidence of student achievement.”
"Institutions and programs share responsibility with accrediting organizations for providing clear and credible information to constituents about what students learn."
From: Statement of Mutual Responsibilities for Student Learning Outcomes: Accreditation, Institutions, and Programs: http://www.chea.org/pdf/stmntstudentlearningoutcomes9-03.pdf
Q: What are YOUR Expectations?
List expectations you have about assessment process and student learning at your institution.
What do we expect to accomplish through assessment?
“What” and “how well” do you expect students to learn?
Re-Visiting Expectations
Why revisit? When and how often should we revisit?
When something changes
When processes are failing
What and How Well Are Our Students Learning? Multiple Focuses:
Programmatic accreditation standards/criteria
Regional accreditation standards/criteria
Accountability pressures from federal and state governments
Institutional decision-making
Program and curricular decision-making

Barriers:
The assessment process is designed for a specific purpose or level; then changes occur, such as motivations, internal needs, and external pressures.
Lack of trust of the motives for, and use of, assessment.
Fear of resistance and hostility.
Miscommunication, especially because of differences in perspectives, expectations, and values.
Fatigue: we all get tired of conversations about assessment, accreditation, and accountability.
Fatigue: does one process fit all?
Despite communication, some don't or won't hear.
Lack of assessment momentum after an accrediting body leaves.
Statements of Good Practice (Suskie)*
Expectations are discussed at program and institutional levels so that expectations are congruent and clear to all engaged in assessment processes.
Faculty, assessment professionals, and leadership discuss the ramifications of programmatic accreditation, regional accreditation, and accountability issues.
The institution promotes an atmosphere of critical reflection about teaching, learning, research, and services.
Assessment reflects what stakeholders really care about.
Assessment evidence is publicly available, visible, and consistent.
*For a summary, see Linda Suskie’s compilation: “What is good assessment: A synthesis of Principles of Good Practice”: From “What is ‘good’ assessment? A new model for fulfilling accreditation expectations,” presented at the First Annual International Assessment & Retention Conference, Phoenix AZ, June, 2006. See Internet Resources for Higher Education Outcomes Assessment website: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
Exercise: What Issues Lead to Barriers?
Consider the multiple focuses again: programmatic accreditation standards/criteria; regional accreditation standards/criteria; accountability pressures from federal and state governments; institutional decision-making; program and curricular decision-making.
Issues: Program Level vs. Institutional Level

Purpose
Program level: formative, for program improvement.
Institutional level: summative: what can we say about student learning for the entire institution?

Fears
Program level: leads to fear of sharing results with decision makers.
Institutional level: fears related to the inability to meet external pressures (accreditation, accountability). Are multiple processes needed because of multiple audiences?

Outcomes
Program level: the relationship of program outcomes to institutional "outcomes," goals, and mission. Isn't having program outcomes for every program enough? Do we need institutional outcomes?
Institutional level: should we have institutional outcomes? How are they defined, and by whom? Are these just "general" outcomes? Do they map to program outcomes? (If having institutional outcomes is enough, why have program outcomes?)

Design of Assessment
Program level: the design of the assessment process may NOT be applicable to "roll up" for institutional reporting; it is more difficult to use institutional results for program improvement.
Institutional level: the design of assessment is more difficult: how do we obtain data applicable for institutional summaries? CLA, NSSE, portfolios.

Cost & Time
Program level: defining the balance of cost/time against the value of evidence for the program; faculty and students are more willing to participate for their own program.
Institutional level: institutional evidence gathering is very costly: CLA, $6,500 plus time; NSSE, $6,000-7,000 plus time; student motivation is lower; portfolios take more faculty time.
Issue: Purpose of Assessment
Program level: formative, for program improvement.
Institutional level: summative: what can we say about student learning for the entire institution?
Solutions:
Discuss faculty expectations, including workload and the purposes of the processes.
Review programmatic and regional accreditation expectations.
Identify what is needed for making program and curricular decisions.
Set the expectation that program assessment results will be used at the institutional level (and/or vice versa).
Examples:
The University of Montana: http://www.umt.edu/provost/pdf/academicassessmentplan2006.pdf
Frostburg State University: http://www.frostburg.edu/admin/apie/ModelPlanningGroupAssessmentPlan08Mar06.pdf
University of Nevada, Reno: http://www.unr.edu/assess/model/index.html
Workload:
NCSU faculty survey results: 25% said they spent time on academic program assessment activities.
California faculty workload study: 7% of faculty spent time on assessment.
Issue: Fears
Program level: leads to fear of sharing results with decision makers.
Institutional level: fears related to the inability to meet external pressures (accreditation, accountability). Are multiple processes needed because of multiple audiences?
Solutions:
Fears have been raised because of the Spellings Commission Report and the Voluntary System of Accountability (VSA). For the VSA, results of student learning outcomes are transparent, which increases discussions about use of evidence, reporting integrity, and audience.
Form a collective group who take responsibility for student learning, develop the methodology of data collection, and work with leadership on HOW the data will be used, including confidence in the validity of the data collection methods.
Reporting and documentation also include the limits of the methodology. Clearly state the appropriate uses of the data and the methodology's limits.
The institutional methodology should be as useful for as many audiences as possible, with clear and timely documentation of results, appropriate uses of results, and issues surrounding data collection.
Examples:
VSA: as of 4/19/2008, 235 institutions had signed up. http://www.voluntarysystem.org/index.cfm?page=templates
Case Western: http://www.case.edu/provost/uplan/Documents/Assessment%20Task%20Force_final2.pdf
Issue: Outcomes
Program level: the relationship of program outcomes to institutional outcomes. Isn't having program outcomes for every program enough? Do we need institutional outcomes?
Institutional level: should we have institutional outcomes? How are they defined, and by whom? Are these just "general" outcomes? Do they map to program outcomes? (If having institutional outcomes is enough, why have program outcomes?)
Solutions (in addition to issues already discussed):
The institution decides what type and level of outcomes to develop besides program outcomes, e.g., for general education/core curriculum, college-wide, or institutional.
Hold focus groups with students and faculty: What do students expect to learn? What do faculty expect students to learn? What do faculty expect ALL students to learn?
Alumni (one to three years after graduation) have a great perspective on what they did learn and what they wished they had learned. Use their insight when reviewing or developing program-level and institutional-level outcomes.
The student experience is a comprehensive view including experiences both inside and outside the classroom (combining curricular and co-curricular assessment processes).
Electronic assessment management tools can help map program and institutional outcomes and goals.
Linked program and institutional outcomes, examples:
Dickinson State University: http://www.dsu.nodak.edu/Catalog/fine_arts/art_majors_minors.htm
St. Olaf College: http://www.stolaf.edu/committees/curriculum/media/PendingMatters/CallForProgramILOs.pdf
Pierce College: http://www.lapc.cc.ca.us/offices/preview/ILOs.doc
Institutional outcomes examples:
University of Nebraska Omaha: http://www.unomaha.edu/assessment/index.php
Worcester Polytechnic Institute: http://www.wpi.edu/Academics/Outcomes/UOAC%20assess%20plan%20April%202005%20rev%20100405.pdf
Issue: Design of Assessment
Program level: the design of the assessment process may NOT be applicable to "roll up" for institutional reporting; it is more difficult to use institutional results for program improvement.
Institutional level: the design of assessment is more difficult: how do we obtain data applicable for institutional summaries? CLA, NSSE, portfolios.
Solutions:
At the institutional level, identify what administrators need, how often, and why.
Discuss how program assessment results can be used (and should not be used) for decision making at the institutional level; discuss how institutional results can be used (and should not be used) for decision making at the program level.
Key words at NCSU: meaningful, manageable, effective, efficient.
Effective assessment is "assessment-as-learning": the assessment process itself increases student learning, e.g., through portfolio feedback, reflections, self-assessment, peer assessment, etc.
Establish an institutional-level assessment committee to coordinate, prioritize, and use assessment data to improve student learning and for institutional improvement purposes (comprised of key faculty members, senior academic and student affairs administrators, students, and staff).
Consider the pros and cons of designing centralized vs. decentralized assessment processes.
Examples:
University of Colorado at Boulder: http://www.colorado.edu/pba/outcomes/handbook.htm
University of Maine: http://www.umaine.edu/provost/committees/SLOA/index.htm
Issue: Resources (Cost & Time)
Program level: defining the balance of cost/time against the value of evidence for the program; faculty and students are more willing to participate for their own program.
Institutional level: institutional evidence gathering is very costly: CLA, $6,500 plus time; NSSE, $6,000-7,000 plus time; student motivation is lower; portfolios take more faculty time.
Solutions:
Embrace the concept that assessment takes time. Time to develop or review program outcomes: up to one year. Time to develop or review institutional outcomes: three to five years.
Determine the costs for each assessment design, then set priorities for use of resources, balancing the value of the assessment evidence against the effort or resources involved in gathering it. Determine what is required.
Weigh the importance of, and requirements for, benchmarking with other institutions using nationally developed instruments.
Examples:
"Assessment, Accountability, and Student Learning Outcomes" by Richard Frye: http://www.ac.wwu.edu/~dialogue/issue2.html
New Directions for Community Colleges, Volume 2004, Issue 126, pages 1-109 (Summer 2004): http://www3.interscience.wiley.com/journal/109580268/issue
Assessment Practice in Student Affairs: An Applications Manual by John H. Schuh and M. Lee Upcraft: http://www.josseybass.com/WileyCDA/WileyTitle/productCd-078795053X.html
Exercises for your institution
See “Exercises” in the handout
Final slide!
Communicate, communicate, communicate:
Vision and expectations about assessment processes
Vision and expectations about what students should be learning
How well students are learning (increase the transparency of evidence across programs, internal and external to the institution)
Contact Information
Joni E. Spurlin, Ph.D.
University Director of Assessment
University Planning and Analysis
Campus Box 7002
NC State University
Raleigh, NC 27695
919-515-6209
http://www2.acs.ncsu.edu/UPA/assmt/index.htm