James Bond, Monorail Cat and partying penguins. What happens when you let students design their own...


James Bond, internet memes and partying penguins

(or, what happens when students write their own assessment content)

Simon Bates, Pearson Strategies for Success Workshop, Toronto, May 2013


Overview

I. Motivation

II. Technology enabler: PeerWise

III. Use cases

IV. Engagement, learning, question quality?


The University of Edinburgh, Edinburgh, Scotland

5th July, 2010

Paul Denny

PeerWise: bridging the gap between online learning and social media

Department of Computer Science, The University of Auckland, New Zealand

• Web-based multiple-choice question repository built by students

• Students:
  – develop new questions with associated explanations
  – answer existing questions and rate them for quality and difficulty
  – take part in discussions
  – can follow other authors

(A toy sketch of this data model follows the usage figures below.)

peerwise.cs.auckland.ac.nz

>100,000 student contributors

>500,000 unique questions

>10,000,000 answers
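To make the feature list above concrete, here is a minimal, hypothetical sketch of the data model such a repository implies. All class and field names are assumptions for illustration; the deck does not describe PeerWise's actual implementation.

```python
# Hypothetical sketch of the data model a PeerWise-style repository implies.
# Class and field names are assumptions for illustration only; this deck
# does not describe PeerWise's actual implementation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Question:
    author: str
    stem: str
    options: List[str]        # answer choices, including distractors
    correct_index: int        # position of the intended answer in options
    explanation: str          # author's associated explanation
    quality_ratings: List[int] = field(default_factory=list)
    difficulty_ratings: List[int] = field(default_factory=list)
    comments: List[str] = field(default_factory=list)

@dataclass
class Student:
    name: str
    following: List[str] = field(default_factory=list)  # authors followed
    answered: int = 0

    def answer(self, q: Question, choice: int) -> bool:
        """Record an attempt and report whether the choice was correct."""
        self.answered += 1
        return choice == q.correct_index

# Usage: one student authors a question, another answers and rates it.
repo: List[Question] = []
q = Question("alice", "What is 2 + 2?", ["3", "4", "5"], 1, "2 + 2 = 4.")
repo.append(q)
bob = Student("bob", following=["alice"])
assert bob.answer(q, 1)
q.quality_ratings.append(4)
```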

As a question author…

[Interface screenshots]

As a question answerer…

[Interface screenshots]

III. Use cases

Timeline

2010-11: UoE pilot study

2011-12: Multi-institution, multi-course

2012-13: UBC PHYS 101; Coursera MOOC


Pilot year (2010-11) – replacing a single hand-in

PeerWise was introduced in workshop sessions in Week 5. Students worked through a structured example task and devised their own questions in groups.

All these resources are available online (see final slide).


An assessment was set for the end of Week 6.

Minimum requirements:
• Write one question
• Answer 5 questions
• Comment on and rate 3 questions

This contributed ~3% to the course assessment (a small sketch of this rule follows below).
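As a worked illustration of the rule above, here is a hedged sketch of the credit check. The all-or-nothing interpretation and all names are assumptions; the deck does not say how partial completion was graded.

```python
# Sketch of the Week 6 minimum-requirements rule above. The all-or-nothing
# interpretation and all names here are assumptions; the deck does not say
# how partial completion was graded.

def met_minimum(written: int, answered: int, rated: int) -> bool:
    """True once a student has written >= 1 question, answered >= 5,
    and commented on & rated >= 3."""
    return written >= 1 and answered >= 5 and rated >= 3

def assessment_credit(written: int, answered: int, rated: int,
                      weight: float = 0.03) -> float:
    """Fraction of the course mark earned (~3% if requirements are met)."""
    return weight if met_minimum(written, answered, rated) else 0.0

# Example: 2 questions written, 7 answered, 3 commented on & rated.
assert assessment_credit(2, 7, 3) == 0.03
```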


We were deliberately hands-off:

• No moderation
• No corrections
• No interventions at all

But we did observe…


• JISC project – SGC4L

N (students) ~800; N (staff) ~10

IV. Engagement, learning, question quality?

Generally, students did

• Participate beyond minimum requirements

• Engage in community learning, correcting errors

• Create problems, not exercises

• Provide positive feedback


Generally, students did not

• Contribute trivial or irrelevant questions

• Obviously plagiarise

• Participate much beyond assessment periods

• All leave it to the last minute


[Figure: PHYS 101 uptake graph, showing the midterm period]



Correlation with end-of-course outcomes

Quartiles:
Q4 – top 25%
Q3 – upper middle
Q2 – lower middle
Q1 – bottom 25%

22 students did not take the FCI.

1st year Physics, N=172, University of Edinburgh

(A toy quartile computation follows below.)
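A minimal sketch of this kind of quartile analysis, using invented numbers and hypothetical column names (the study's actual data and activity measure are not reproduced here):

```python
# Toy version of the quartile split: rank students by a PeerWise activity
# measure, cut into quartiles Q1 (bottom 25%) to Q4 (top 25%), and compare
# mean end-of-course scores per quartile. The data and column names are
# invented; the study's actual activity measure is not reproduced here.
import pandas as pd

df = pd.DataFrame({
    "activity":   [12, 3, 45, 7, 30, 22, 5, 18],
    "exam_score": [62, 48, 81, 55, 74, 70, 50, 66],
})
df["quartile"] = pd.qcut(df["activity"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(df.groupby("quartile", observed=True)["exam_score"].mean())
```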


Comprehensive categorisation of >50% of the repository for two successive academic years.

Principal measures to define a 'high quality question':
- cognitive level of question
- explanation quality
- other binary criteria

Cognitive level of question:

Category  Description
6         Create (synthesise ideas)
5         Assess
4         Analyse (multi-step)
3         Apply (1-step calcs.)
2         Understand
1         Remember

Explanation quality:

0 – Missing
1 – Inadequate (e.g. wrong reasoning or answer; trivial, flippant, or unhelpful)
2 – Minimal (e.g. correct answer, but with insufficient explanation or justification; aspects may be unclear)
3 – Good/Detailed (e.g. clear and sufficiently detailed exposition of the correct method and answer)
4 – Excellent (e.g. describes the physics thoroughly, remarks on the plausibility of the answer, uses appropriate diagrams, perhaps explains the reasoning behind the distractors)

Results: Question level, Physics 1A/1B 2011

[Figure: percentage of submitted questions (0-50%) by taxonomic category (1-6); first semester N = 350, second semester N = 252]

Results: Explanation quality, Physics 1A 2010 and 2011

‘High quality’ question (see the sketch after this list)

1. At least 2/6 on cognitive level (“understand” and above)

2. At least 2/4 on explanation (“minimal” and above)

3. Clearly worded question (binary)

4. Feasible distractors (binary)

5. ‘Most likely’ correct (binary)

6. ‘Not obviously’ plagiarised (binary)
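A compact sketch of this test, treating the six criteria as a joint (AND) requirement; that interpretation, and the function and parameter names, are assumptions for illustration.

```python
# Compact sketch of the 'high quality question' test: cognitive level of at
# least 2/6, explanation of at least 2/4, and all four binary criteria met.
# Treating the criteria as a joint (AND) requirement is an assumption, as
# are the function and parameter names.

def is_high_quality(cognitive_level: int, explanation_score: int,
                    clearly_worded: bool, feasible_distractors: bool,
                    likely_correct: bool, not_plagiarised: bool) -> bool:
    return (cognitive_level >= 2 and explanation_score >= 2
            and clearly_worded and feasible_distractors
            and likely_correct and not_plagiarised)

# Example: an 'Apply'-level question (3/6) with a minimal explanation (2/4)
# that passes all binary checks counts as high quality.
assert is_high_quality(3, 2, True, True, True, True)
```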

Results: Physics 1A, 2010 and 2011

Two successive years of the same course (N = 150, 350):

• 'High quality' questions: 78%, 79%
• Over 90% (most likely) correct, and 3/5 of those wrong were identified by students
• 69% (2010) and 55% (2011) rated 3 or 4 for explanations
• Only 2% (2010) and 4% (2011) rated 1/6 for taxonomic level

Literature

Bottomley & Denny, Biochem. Mol. Biol. Educ. 39(5), 352-361 (2011)
• 107 Year 2 biochemistry students
• 56 / 35 / 9 % of questions in the lowest three levels

Momsen et al., CBE-Life Sci. Educ. 9, 436-440 (2010)
"9,713 assessment items submitted by 50 instructors in the United States reported that 93% of the questions asked on examinations in introductory biology courses were at the lowest two levels of the revised Bloom's taxonomy"

Summary

• High general standard of engagement and student-generated questions

• Relatively few basic knowledge questions

• Transferable across disciplines / institutions

• We hypothesise that scaffolding activities are critical for high-level cognitive engagement

Acknowledgements

Physics 101 course team: Georg Rieger, Firas Moosvi, Emily Altiere (UBC CWSEI)

simon.bates@ubc.ca
@simonpbates

Ross Galloway, Judy Hardy, Karon McBride, Alison Kay, Keith Brunton, Jonathan Riise, Danny Homer

Chemistry – Peter Kirsop

Biology – Heather McQueen

Physics – Morag Casey

Comp Sci – Paul Denny

Resources

Community: http://www.PeerWise-Community.org

JISC-funded multi-institution study: https://www.wiki.ed.ac.uk/display/SGC4L/Home

UoE Physics pilot study: AIP Conf. Proc. 1413, 359, http://dx.doi.org/10.1063/1.3680069

RSC overview article: http://www.rsc.org/Education/EiC/issues/2013January/student-generated-assessment.asp

UoE Physics scaffolding resources: http://www2.ph.ed.ac.uk/elearning/projects/peerwise/

Publications in preparation / review / press

Question quality analysis (1st year Physics, University of Edinburgh): "Assessing the quality of a student-generated question repository", submitted to Phys. Rev. ST Phys. Educ. Res.

Multi-institution, multi-course study: "Student-generated content: Enhancing learning through sharing multiple-choice questions", submitted to International Journal of Science Education

"Scaffolding Student Learning via Online Peer Learning", submitted to International Journal of Science Education

Copyright 2013 Graham Fowell / The Hitman, reproduced with permission, Education in Chemistry, Vol 50 No 1 (2013)

Photo credits

Community: http://www.flickr.com/photos/kubina/471164507/

Screen grab from Michael Wesch's 'A vision of students today': http://www.youtube.com/watch?v=dGCJ46vyR9o
