
Using NSSE to Inform Course-Evaluation Revision

Edward Domber

Christopher J. Van Wyk

Drew University

NSSE Workshop, SCSU, October 2006


Mise en scène

• 23 items, written in the mid-1970s

• Scantron® with free response on reverse

• Conducted near end of semester

• Returned via dept. chairs, 2-4 months later
  – In a plain manila envelope
  – With printout showing response distribution for section and means for section, department, division, and College


Spring 2000 Survey (part of Middle States self-study)

• “Drew’s current use of student course evaluations for assessing teaching and learning is adequate” (1 = strongly disagree; 6 = strongly agree)

• Faculty response
  – mean: 3.5
  – s.d.: 1.5 (among the largest on the survey)


Audience Participation

What would faculty on your campus say about student course-evaluation forms?


Memorable Words

• “When we surveyed several hundred faculty and administrators, we found a surprising lack of knowledge about the literature of student ratings and even about the basic statistical information necessary to interpret ratings reports accurately. That lack of knowledge correlated significantly with negative opinions about evaluation, student ratings, and the value of student feedback.” (Theall and Franklin, p. 46)


Two Handy Starting Points

American Psychologist 52 (1997)

• Greenwald, ed.
• incl. McKeachie

New Directions for Institutional Research, no. 109 (2001)

• Theall, Abrami, Mets, eds.

• incl. Theall & Franklin• incl. Kulik


Re: Presentation of Results

• “[T]he use of norms not only leads to comparisons that are invalid but also is damaging to the motivation of the 50% of faculty members who find that they are below average. Moreover, presentation of numerical means or medians (often to two decimal places) leads to making decisions based on small numerical differences” (McKeachie, p. 1223, emphasis added)


Re: Use of Results

• “[E]valuation experts usually advise teachers with low ratings to concentrate on their greatest relative weakness. Fix it, the experts advise, and the whole profile of ratings may go up. . . . changes in profile elevation are commonplace with highly intercorrelated rating scales” (Kulik, p. 22).


Facts about Drew’s data

• On each of the 11 items that are answered on a scale of 1-7:
  – Modal response = 7 (the best possible)
  – Mean response ≈ 5.9
  – s.d. ≈ 1.3
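A minimal sketch of how top-heavy such a distribution is; the probabilities below are invented to reproduce the summary statistics above, not drawn from Drew's data:

```python
import numpy as np

# Hypothetical 1-7 ratings whose probabilities are tuned to mimic the
# pattern reported above: responses pile up at 7, pulling the mean to
# about 5.9 with an s.d. near 1.3.
rng = np.random.default_rng(0)
ratings = rng.choice(np.arange(1, 8), size=10_000,
                     p=[0.01, 0.02, 0.03, 0.07, 0.16, 0.28, 0.43])

values, counts = np.unique(ratings, return_counts=True)
mode = values[counts.argmax()]
print(f"mode = {mode}, mean = {ratings.mean():.2f}, "
      f"s.d. = {ratings.std(ddof=1):.2f}")
```

With the mode pinned at the top of the scale, small differences in means carry little information, echoing McKeachie's caution above about decisions based on small numerical differences.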


Analysis of Drew data, I

• Slight relationship (r < 0.2) between EXPECTED GRADE and other responses [N.B., “absence of evidence is not evidence of absence” (Rumsfeld)]

• 13 items load onto one factor that explains 38% of the variation in responses

• Both results replicate findings in the literature
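The deck does not name the software behind these analyses. As a minimal sketch, both computations can be done with pandas/NumPy, using the largest eigenvalue of the item correlation matrix as a stand-in for the variance explained by a single factor; all column names here are hypothetical:

```python
import numpy as np
import pandas as pd

def grade_correlations(df: pd.DataFrame, grade_col: str = "expected_grade"):
    """Pearson r between EXPECTED GRADE and every other item."""
    return df.corr(numeric_only=True)[grade_col].drop(grade_col)

def first_factor_share(df: pd.DataFrame, item_cols: list[str]) -> float:
    """Share of total item variance captured by the first principal
    factor of the correlation matrix (the '38%'-style statistic)."""
    corr = df[item_cols].corr().to_numpy()
    eigvals = np.linalg.eigvalsh(corr)   # eigenvalues in ascending order
    return eigvals[-1] / len(item_cols)  # largest eigenvalue / no. of items
```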


Added to this mix: NSSE results


Analysis of Drew data, II

• Specific request: what can current course evaluations tell us about engagement?

• Regression analyses using as independent variables:
  – Class size
  – Level of study
  – Curricular division
  – Reason for taking
  – Anticipated grade
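A minimal sketch of one such regression, using statsmodels' formula interface; the outcome and predictor names are hypothetical, since the slide lists the variables but not their coding:

```python
import pandas as pd
import statsmodels.formula.api as smf

def engagement_regression(evals: pd.DataFrame):
    """OLS of an engagement-related item on the five predictors listed
    above. Column names are assumptions; C(...) marks categoricals."""
    return smf.ols(
        "engagement_item ~ class_size + C(level) + C(division)"
        " + C(reason_for_taking) + anticipated_grade",
        data=evals,
    ).fit()

# Usage, assuming a file of one row per completed evaluation:
# print(engagement_regression(pd.read_csv("evals.csv")).summary())
```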


Class Size Matters

• Two definitions of average
  – Mean class size: 18 (“as seen by faculty”)
  – Mean class size weighted by enrollment: almost 25 (“as seen by students”)
  – Why? “more students in a large class than in a small class”

• Punch line: Student effort, reported amount of work assigned, and satisfaction all decrease as class size increases.
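Why the two averages differ: the enrollment-weighted mean is sum(n_i^2) / sum(n_i) rather than sum(n_i) / k, so any spread in section sizes pushes it up. A toy example with section sizes invented to reproduce the 18 vs. almost-25 gap:

```python
# Hypothetical section sizes chosen so the two averages match the deck.
sizes = [6, 8, 10, 14, 20, 30, 38]

faculty_view = sum(sizes) / len(sizes)                 # 18.0
student_view = sum(n * n for n in sizes) / sum(sizes)  # ~24.9
# Weighting by enrollment counts each student once; more students sit
# in the large sections, so the average student's class is bigger.
print(f"faculty view: {faculty_view:.1f}, student view: {student_view:.1f}")
```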


Added Other Items

• Some phenomenological (e.g., pacing)

• “How often did instructor cancel class?”

• Some inspired by NSSE & mission statement
  – Student inputs (e.g., how often missed class)
  – Student outputs (e.g., how much class contributed to ability to write clearly and effectively)


Sample Items Relating Mission to NSSE Items

Questions 12-22 assess the extent to which this course contributes to various learning objectives. We understand that not all courses are intended to contribute to all of the objectives listed. With that in mind, please feel free to select “not at all” if that seems to be the most appropriate answer for this course.


“College challenges students…to develop their capacities for:”

Phrase from Mission:

“critical thought”

NSSE Related Item (11c):

“This class contributed to my ability to think critically”


“College challenges students…to develop their capacities for:”

Phrase from Mission:

“Effective Communication”

NSSE Related Item (11d):
“This class contributed to my ability to speak clearly and effectively.”


“College challenges students…to develop their capacities for:”

Phrase from Mission:

“Problem Solving”

NSSE Related Item (11f):
“This class contributed to my ability to analyze quantitative problems.”


“College challenges students…to develop their capacities for:”

Phrase from Mission:

“Living… in an increasingly diverse world”

NSSE Related Item (11l):
“This class contributed to my ability to understand an increasingly diverse world.”


“College challenges students…to develop their capacities for:”

Phrase from Mission:

“Creativity”

NSSE Related Item (none?):
“This class contributed to my ability to be creative.”


Faculty Discussions

• Draft circulated

• Sticking points
  – “How often was class cancelled?”
  – Length
  – Order of items
  – Unipolar v. bipolar Likert scales (!)


Vote in May 2005

• Linked two items’ fate
  – “student missed class” (min response: never)
  – “class cancelled” (min response: never)

• Provide cover sheet for instructors to explain “unusual circumstances”

• Condone length for now; follow-up will identify redundant questions


Work in Fall 2005

• On-line version developed

• Pilot of on-line version to check technology
  – Small sample of classes
  – Generate comments on each item
  – Check web-based interface


Work in Spring 2006

• Further refinement
• Plan for major pilot testing
  – 1/3 of courses sampled
  – Communication with faculty and students carefully designed
  – Both paper (current) and on-line versions were administered and linked
  – Administration window was planned
  – Ways of increasing response rate were considered, but not instituted


Work in Fall 2006

• Pilot data: preliminary results
  – Response rate: N = 2397 (85% paper, 66% on-line, with n = 730 able to be matched)
  – Relationship of “paper” to new on-line form
    • Comparable paper and on-line items
    • Correlation of items measuring quality
  – Relationship holds within different course levels
  – Open-ended comments
  – Length, redundant items, and response rate
  – Considerable variability on NSSE-related items
  – Factor structure
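A minimal sketch of the matching and cross-mode comparison, assuming the two administrations share an identifier; the key and item names are hypothetical:

```python
import pandas as pd

def match_modes(paper: pd.DataFrame, online: pd.DataFrame,
                key: str = "student_course_id") -> pd.DataFrame:
    """Inner-join the two administrations on a shared identifier,
    keeping only evaluations completed in both modes (n = 730 here)."""
    return paper.merge(online, on=key, suffixes=("_paper", "_online"))

def cross_mode_r(matched: pd.DataFrame, item: str = "quality") -> float:
    """Correlation of one comparable item across the two modes."""
    return matched[f"{item}_paper"].corr(matched[f"{item}_online"])
```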


Reporting to Faculty and Faculty Evaluators

• Should we provide several report formats?
• Should we provide an overall rating of “teaching quality”?
• What kinds of norms should we supply?
  – Breakdown by class size?
• Should constituents get reports on-line?
• How do we protect student “confidentiality”?
• How do we report “qualitative” data?


Should we

• Ask faculty to indicate their expectations for student responses to the “mission related” and “other” items? Discrepancies may provide a benefit similar to NSSE-FSSE comparisons.

• Develop an overall measure of quality and provide more norms?


Work is Not Over

• Ongoing assessment was promised

• Eventually, hope to have a bank of optional items

• Suggestions for use by instructor


Summary: NSSE’s Influence

• NSSE results engaged and enraged faculty

• NSSE inspired content of many new items

• NSSE results suggested new kinds of questions to ask on course evaluations


• Embedding NSSE in course evaluations keeps NSSE prominent in faculty conversation

• New course evaluations will provide information useful for course revision, not just instructor evaluation

• Faculty able to derive useful information are less critical of the process.

