Assessing Institutional Effectiveness Through the Electronic Portfolio



Lynda Barner West, Ed.D., Associate Vice President – Technology and Information Services

Andrea Beranek, M.A., Assistant to the Vice President for Academic Affairs, Program Support

Institutional Background

• Roman Catholic – Sisters of Mercy
• Urban campus
  – Oakland area of Pittsburgh, Pennsylvania
  – 81% of undergraduates commute
  – 90,000 college students in the region
• Private Master's Colleges and Universities I
• 2,000 students
  – 80% undergraduate
  – 95% women at the undergraduate level
  – 22% minority
  – 55% first-generation college

Why Are Assessment Issues at the Forefront Now?

• New Core Curriculum implementation
  – Fall 2003 implementation of the new undergraduate Core Curriculum
• Middle States standards
  – Assessment plan
    • Shift from strongly recommended to required
  – Student learning outcomes

Charge from VPAA

• Systematic assessment of student learning is essential to monitoring quality and providing information that leads to improvement

• Develop a workable plan that addresses individual student and cohort learning outcomes at institutional, program, and individual course levels

Beyond Data Collection

Subcommittee

• Faculty
  – Director of Core Curriculum Implementation
  – Chair of the Curriculum Committee
  – Chairs of the Skills Integration Subcommittees
    • Writing, research, technology, quantitative reasoning, speaking
• Academic Affairs staff
  – AVP, Technology and Information Services
  – Assistant to the VP for Academic Affairs

The Assessment Puzzle

[Diagram: four interlocking puzzle pieces – Individual Growth, Core Effectiveness, Major Sequence, and Institutional Effectiveness.]

Four Levels

• Individual student
  – Baseline
  – Progress through the curriculum
  – Performance at exit
• Core Curriculum
  – Skills
  – Concepts
• Program and major
  – Skills
  – Concepts
  – Sequences
• Institutional level
  – Overall effectiveness

Levels Across Time

[Diagram: matrix showing the four levels – Individual, Core, Program, and Institution – tracked at five points in time: entry baseline and the ends of the first, second, third, and fourth years.]

CATE: Carlow's Achievement Tracking ePortfolio

[Diagram: CATE collects baseline scores and artifacts from Core and major courses under skill and outcome areas – career literacy, critical reasoning, quantitative reasoning, research, service learning, speaking, technology, and writing.]

Conceptual Design Considerations

• Use and usability
  – Who
    • Individual student
    • Faculty committees
    • Institutional reviewers
  – How
    • Individually
    • Local
    • Remote

Access and Security

• Levels
  – Individual portfolio contributor
  – Viewer of groups
    • Faculty
    • Advisor
    • Assessor
  – Systems administrator

Views of Data

• Contributor – individual view
• Assessor – selection of a statistical sample or cohort
• Administrator – database layer

Portfolio User Roles and Views

• Student – individual and detailed
• Advisor – group, mid-level, and across many 'topics'
• Faculty – individual and associated with a course
• Assessor – group, aggregate, anonymous
• Administrator – the behind-the-scenes technical operations

Evaluating Existing Models and Products

• "Notebook"
  – Physical notebook
  – Digital notebook (e.g., a CD)
• Web envelope
  – Student has a web site and required elements
• Web template
  – Framework with categories of items and 'slots'

Meeting Our Needs

• Institutional needs
  – Query the contents
  – Select samples
  – Report on contents
    • For advisors
    • For assessors

Evaluation Considerations

• Ease of use regardless of role
• Ease of maintenance (low support requirements in IT)
• View as individual
• View as advisor
• View as curriculum assessor
• Standard reports
• Ad hoc queries
• Cost
• Persistence of data

Technical Overview

• Windows Server 2003
• Underlying database is SQL Server
• ColdFusion is used to talk to the database
  – Information is imported from the back-end administrative database
  – Accounts are maintained through ColdFusion, which creates a person record rather than a Windows user account on the server
  – Each user has a unique login and directory space
  – Each user has a person record in the database
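The person-record approach can be read as a short ColdFusion routine. The sketch below is a minimal, hypothetical illustration: the datasource name cate, the person table and its columns, and the D:\portfolios path are assumptions, not Carlow's actual schema.

    <!--- Hypothetical sketch: create a person record in the database
          (not a Windows account) and a private directory space for the
          new portfolio user. --->
    <cfquery name="qInsertPerson" datasource="cate">
        INSERT INTO person (username, display_name, role)
        VALUES (
            <cfqueryparam value="#form.username#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#form.display_name#" cfsqltype="cf_sql_varchar">,
            'student'
        )
    </cfquery>

    <!--- Each user gets a unique directory space for uploaded artifacts. --->
    <cfset userDir = "D:\portfolios\#form.username#">
    <cfif NOT DirectoryExists(userDir)>
        <cfdirectory action="create" directory="#userDir#">
    </cfif>

Because the account lives only in the database, batch imports from the administrative system come down to looping over the same insert.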

Front End

• Written in ColdFusion code
  – CFML (ColdFusion Markup Language)
  – HTML (standard hypertext)
• Passwords are encrypted by ColdFusion before being stored in the database
• ColdFusion is used to create filing areas (directory maintenance) and to handle other administrative tasks (account creation and storage allocation)
• Windows authentication is not used
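One plausible reading of "passwords encrypted by ColdFusion" is a login page that hashes the submitted password and compares it with the stored value, with no Windows authentication involved. The sketch below uses ColdFusion's built-in Hash() function as a stand-in for whatever encryption routine is actually used; the table and column names are hypothetical.

    <!--- Hypothetical login check: compare the ColdFusion-hashed password
          to the encrypted value stored with the person record. --->
    <cfquery name="qLogin" datasource="cate">
        SELECT person_id, role
        FROM person
        WHERE username = <cfqueryparam value="#form.username#" cfsqltype="cf_sql_varchar">
          AND password_hash = <cfqueryparam value="#Hash(form.password)#" cfsqltype="cf_sql_varchar">
    </cfquery>

    <cfif qLogin.recordCount EQ 1>
        <cfset session.personID = qLogin.person_id>
        <cfset session.role = qLogin.role>
    <cfelse>
        <cflocation url="login.cfm?failed=1">
    </cfif>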

Templates

• Institutional
  – Set by the assessor
  – Highly flexible
• Personal
  – Set by the individual
  – Highly flexible

Template

• Description of Expected Artifact

• Desired Student Learning Outcomes

• Eligible Courses
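The three template elements listed above map naturally onto a single database record. A minimal sketch of saving one institutional template follows; the template table and its columns are illustrative assumptions, not the production schema.

    <!--- Hypothetical sketch: one institutional template row holding the
          expected-artifact description, the targeted student learning
          outcomes, and the courses eligible to supply the artifact. --->
    <cfquery name="qInsertTemplate" datasource="cate">
        INSERT INTO template (artifact_description, learning_outcomes, eligible_courses)
        VALUES (
            <cfqueryparam value="#form.artifact_description#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#form.learning_outcomes#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#form.eligible_courses#" cfsqltype="cf_sql_varchar">
        )
    </cfquery>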

Brief Tour

• http://cate.carlow.edu

• Patent Pending

Reports

• Written in ColdFusion
  – Queries are written in SQL and pasted into ColdFusion
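To illustrate the pattern, the sketch below embeds a SQL Server aggregate query in a cfquery tag and prints an assessor-style summary: average artifact score per skill area for one entering cohort, with no individually identifying columns. The datasource, tables, and columns are hypothetical.

    <!--- Hypothetical assessor report: average score per skill area for
          one cohort, aggregated so no individual student is identified. --->
    <cfquery name="qReport" datasource="cate">
        SELECT a.skill_area, COUNT(*) AS artifact_count,
               AVG(a.score * 1.0) AS avg_score
        FROM artifact a
        INNER JOIN person p ON p.person_id = a.person_id
        WHERE p.cohort_year = <cfqueryparam value="#url.cohort#" cfsqltype="cf_sql_integer">
        GROUP BY a.skill_area
        ORDER BY a.skill_area
    </cfquery>

    <cfoutput query="qReport">
        #skill_area#: #artifact_count# artifacts, average score #NumberFormat(avg_score, "0.00")#<br>
    </cfoutput>

A statistical sample rather than a full cohort (the assessor view described earlier) could be drawn the same way, for example with SQL Server's TOP n ... ORDER BY NEWID().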

Security

• Accessible via the Internet
• Directory browsing is turned off
  – The owner can make items public (sets a flag)
  – Queries pick up the template, but only items that are marked public are available
• Passwords encrypted (via ColdFusion routines)
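The public flag described above amounts to a filter in every query a remote viewer can reach. A minimal sketch, again with assumed table and column names:

    <!--- Hypothetical sketch: a viewer's query picks up the template but
          returns only artifacts the owner has flagged as public. --->
    <cfquery name="qPublicItems" datasource="cate">
        SELECT a.artifact_id, a.title, a.file_path
        FROM artifact a
        WHERE a.person_id = <cfqueryparam value="#url.personID#" cfsqltype="cf_sql_integer">
          AND a.is_public = 1
    </cfquery>

With directory browsing turned off, files are reachable only through pages that run queries like this one.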

Levels of Access

• Administrative 1 – create accounts and directories, manage batch processing
• Administrative 2 – add and inactivate templates for a person, assign templates to students, reset passwords
• Help desk – password maintenance
• Institutional assessor – reporting across all portfolios
• Advisor / faculty – see assigned students
  – Designation is located in the administrative database and imported
• Individual – sees own materials and advisor comments
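One straightforward way to enforce these levels in ColdFusion is a role check at the top of each restricted page, as sketched below. The session variable and the role codes are assumptions used only for illustration.

    <!--- Hypothetical gate on an administrative page: only the two
          administrative roles may create accounts, manage directories,
          or assign templates. --->
    <cfif NOT ListFindNoCase("admin1,admin2", session.role)>
        <cflocation url="unauthorized.cfm">
    </cfif>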

Data Analysis Issues

• How to use the data to create information
  – Individual progress
  – Impact of the Core Curriculum
  – Program effectiveness in the major
• How to use the data to inform
  – Curriculum
  – Institutional practice

In Process

• Rubric Development

• Artifact Descriptors

• Evaluative Methodology

• Focus deeply and from multiple views on select student learning outcomes
  – Narrow and deep vs. wide and shallow

Other Critical Issues

• Faculty buy-in
• Faculty training
• Student buy-in and training

Additional Resources

1. AAHE Electronic Portfolio Project. http://webcenter1.aahe.org/electronicportfolios/index.html

2. Angelo, T.A. and K.P. Cross. Classroom Assessment Techniques: A Handbook for College Teachers. San Francisco: John Wiley, 1993.

3. Cambridge, Barbara L., ed. Electronic Portfolios: Emerging Practices for Students, Faculty, and Institutions.

4. Edwards, N., C. Heider, and R. Port. Common Rubrics Approach. Washington, DC: AACTE, 2001.

5. Huba, Mary E. and Jann E. Freed. Learner Centered Assessment on College Campuses: Shifting the Focus from Teaching to Learning. Boston: Allyn and Bacon, 2000.

Resources (cont'd)

6. Palomba, C. A. and T. W. Banta. Assessment Essentials: Planning, Implementing and Improving Assessment in Higher Education. San Francisco: Jossey-Bass, 1999.

7. Siegal, Michael J. The College Student Experiences Questionnaire: Assessing Quality of Effort and Student Gains. First Year Assessment, May 10, 2003.

8. ———. Primer on the Assessment of the First College Year. Brevard, NC: Policy Center on the First Year of College, 2003.

9. Student Learning Assessment: Options and Resources. Philadelphia: Middle States Commission on Higher Education, 2003.

10. Suskie, L. Assessment to Promote Deep Learning: Insights from AAHE’s 2000 and 1999 Assessment Conferences. Washington, DC: American Association for Higher Education, 2001.

Resources (cont'd)

11. Swing, Randy Ed. Proving and Improving: Strategies for Assessing the First College Year (Monograph No. 33). Columbia, SC: University of South Carolina, National Resource Center for the First-Year Experience and Students in Transition, 2001.

12. Walvoord, B. E. and V. Johnson-Anderson. Effective Grading: A Tool for Learning and Assessment. San Francisco: Jossey-Bass, 1999.

13. Yogan, Lissa. Using Qualitative Methods to Develop Faculty as Stakeholders in Assessment: A Case Study. First Year Assessment Listserv, September 7, 2001.

Listserv on Assessment – Policy Center on the First Year of College

FYA-LIST@LISTSERV.SC.EDU

http://cate.carlow.edu