
Learning Analytics - From theory to practice

Annelies Raes, Wim Van den Noortgate, Stijn Van Laer

Educational Technology Day

Learning Analytics

May 16, 2018

URL: www.pollev.com/smarted

Innovation trends: Outlook 2020

Fundamental societal transitions

© Center for Curriculum Redesign, 2018

[Figure: tasks mapped along two dimensions: routine vs. non-routine tasks, and tasks supported vs. not supported by ICT]

Changing focus in the curriculum

© Center for Curriculum Redesign, 2018

[Figure: four quadrants – non-routine tasks with high-level use of ICT, routine tasks with high-level use of ICT, non-routine tasks with low-level use of ICT, routine tasks with low-level use of ICT]

Changing focus in the curriculum

• Focus on ‘foundational skills’, 21st Century skills

Today’s challenge

• Valuing – practicing – assessing

Learning Analytics is in the air…

LA offers support from different perspectives

(Janssen, Molenaar, & Van Leeuwen, 2018)

Different dimensions of LA

1. What? What kind of data does the system gather, manage, and use for the analysis?

2. Who? Who is targeted by the analysis?

3. Why? Why does the system analyze the collected data?

4. How? How does the system perform the analysis of the collected data?

(Chatti et al., 2012)

Applying LA – three use cases

I. Using log data for adaptive item selection

(Wim Van den Noortgate)

II. Using log data for instructional design purposes (Stijn Van Laer)

III. Using multimodal data to monitor engagement during both face-to-face and remote learning (Annelies Raes)

Using log data for adaptive item selection

Dmitry Abbakumov, Frederik Cornillie, Sean Joo, Trevor Kadengye, Ellie Park, Kelly Wauters, Wim Van den Noortgate

Example: item-based e-learning

In-video formative assessment

Summative assessment

Example: MOOCs

Example: educational game

DATA STRUCTURE

Persons × items response matrix (1 = correct, 0 = incorrect; blanks = item not administered):

1 1 1 0 0 1 1 0 1
0 0 1 0 0 0 1 0
1 1 1 0 0 0
1 0 1 1
0 0 0 0 1

Using IRT models

[Figure: persons’ ability and item difficulties (items 1–3) located on one latent scale]
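As a concrete illustration (not from the slides): in R, a Rasch-type IRT model can be written as a generalized linear mixed model with crossed random effects for persons (ability) and items (difficulty), the formulation used in explanatory IRT. A minimal sketch on hypothetical stand-in data in long format, one row per person-item pair as in the data structure above:

# Hypothetical 0/1 response data in long format.
library(lme4)

set.seed(1)
resp <- expand.grid(person = factor(1:50), item = factor(1:20))
resp$correct <- rbinom(nrow(resp), 1, 0.6)   # stand-in for real log data

# Rasch model as a GLMM: crossed random effects place persons (ability)
# and items (easiness) on one common latent scale.
fit_irt <- glmer(correct ~ 1 + (1 | person) + (1 | item),
                 data = resp, family = binomial)

head(ranef(fit_irt)$person)   # person estimates on the latent scale
head(ranef(fit_irt)$item)     # item locations on the same scale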

Testing persons using a ‘calibrated’ scale

Comparing persons

Not necessarily same or equivalent test!

Progress testing

Evaluate learning

• Using intermediate assessments?

o often tests too short

o obtrusive

• Using responses on all tasks and exercises → requires dynamic models

Describe learning

[Figure: described learning trajectories; N = 3803 persons, I = 610 items]

Describe learning

[Figure: the response matrix linked to three kinds of covariates]

• Item characteristics: content; type (e.g., MC); including video/audio; …

• User characteristics: age; previous courses; motivation; …

• Context characteristics: time; location; previous behavior; …

Explain learning

• How is learning influenced by:

o user characteristics,

o item characteristics,

o context characteristics,

o previous behavior, and

o interactions of these!

Theoretical interest

Practical interest: adaptation of the learning environment to the learner

Explain learning

Track learning ‘on the fly’

Personalisation of the learning environment!
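What this adaptation can look like in code: a minimal sketch of adaptive item selection under a Rasch model, assuming item difficulties have already been calibrated. At each step the unseen item with maximal Fisher information at the current ability estimate is selected; all numbers below are hypothetical.

# Pick the unseen item whose Fisher information p*(1-p) is highest,
# i.e. the item closest to the learner's current level.
select_item <- function(theta, difficulty, administered) {
  p    <- plogis(theta - difficulty)   # P(correct) for each item
  info <- p * (1 - p)                  # Fisher information at theta
  info[administered] <- -Inf           # never repeat an item
  which.max(info)
}

difficulty <- c(-1.5, -0.5, 0.0, 0.7, 1.8)                     # hypothetical calibrated items
select_item(theta = 0.4, difficulty, administered = c(1, 3))   # -> item 4

Selecting the most informative item keeps assessments short and unobtrusive, while the ability estimate is updated after every response.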

Cold start problem!

• New users + users that were absent for a long time

• Improvements:

o Use explanatory models to help the tracking algorithm (a sketch follows below)

o Use previous trajectory

[Figure: ability tracking without vs. with background information]
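One reading of the first improvement, as a hypothetical sketch: fit an explanatory model on existing users and use its prediction as the starting ability for a new user, instead of starting everyone at zero.

# All data below are invented for illustration.
set.seed(2)
history <- data.frame(theta        = rnorm(200),   # abilities of known users
                      age          = sample(18:60, 200, replace = TRUE),
                      prev_courses = rpois(200, 2))
prior_model <- lm(theta ~ age + prev_courses, data = history)

new_user    <- data.frame(age = 24, prev_courses = 4)
theta_start <- predict(prior_model, newdata = new_user)  # informed start value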

Complex tasks imply multiple skills …

Example: Monkey Tales educational game

EDUCATIONAL GAME: TRACING MATH & GAMING SKILLS

Using log data for instructional design purposes

Stijn Van Laer & Jan Elen, Centre for Instructional Psychology and Technology

Operationalization of SRL

cyclical, influenceable, and covert in nature


Self-regulated learning and log files

• Investigating self-regulated learning

o Behaviour: event sequence analysis (log files)

o Outcomes: changes in cognitive, motivational, and meta-cognitive variables (behavioural consequences)

o Behaviour + outcomes: goal-directed learning behaviour or self-regulated learning

• Data structure (sketched below)

o Computer log files (Learning Management System)

o Time Stamped Event (TSE) data

o Data-driven approach (no recoding, transforming, etc.)
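A minimal illustration of what such TSE data might look like once extracted from the LMS logs; column names are illustrative, not the actual export format.

# One row per logged action, no recoding or transformation needed.
log_tse <- data.frame(
  id        = c("s01", "s01", "s01", "s02", "s02"),
  timestamp = as.POSIXct(c("2016-02-01 09:01:12", "2016-02-01 09:03:40",
                           "2016-02-01 09:10:05", "2016-02-01 09:02:55",
                           "2016-02-01 09:04:10")),
  event     = c("open_course", "view_content", "submit_task",
                "open_course", "take_test")
)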

Example study

Research question

“How do cues for reflection in blended learning environments impact both learners’ (a) products and (b) processes of learning?”

Context and sample

Context

• Flemish centre of adult education – Second chance education;

• “AAV Wiskunde M2” – Basic statistics (surveys, frequency tables, etc.);

• Two eight-week courses – co-designed by two instructors.

Sample

• 41 adult learners (25 women, 16 men), aged from below 20 to a maximum of 50 years;

• No secondary-education diploma – but often plenty of experience;

• Exposed to mathematics and computers before.

METHOD (I) – Pre- and post-test

Description of internal conditions: cognition and motivation

• Cognitive conditions

o Performance-based prior domain knowledge test – Test exam

o Performance-based domain knowledge test – Test exam

• Motivational conditions

o Achievement Goal Questionnaire - Revised (AGQ-R) (1) – Likert type scale

o Academic self-concept (ASC) scale (2) – Likert type scale

The tests and questionnaires were piloted and tested for internal consistency to ensure scale reliability. (3)

(1) Elliot & Murayama, 2008; (2) Liu & Wang, 2005; (3) Cronbach, 1951

METHOD (II) – Intervention

Description of external conditions: design of the environments

METHOD (III) – Products of learning

2 x 2 Mixed ANOVA

o Within-subject factor: Time (pre and post test)

o Between-subject factor: Condition (experimental and control)

Interaction effect Time × Condition?

Affordances and constraints

o Missing data: time-point vs. listwise deletion

o Possibility for post hoc tests

o Tracking and interpreting interactions

o Need for larger sample sizes

* SPSS
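The deck ran this analysis in SPSS; an equivalent sketch in R (the language used for the sequence analyses) with base aov() on hypothetical long-format data:

# One row per learner x time point; condition is constant within learner.
d <- data.frame(
  id        = factor(rep(1:10, each = 2)),
  condition = factor(rep(c("control", "experimental"), each = 10)),  # between
  time      = factor(rep(c("pre", "post"), times = 10)),             # within
  score     = rnorm(20, mean = 10)                                   # stand-in scores
)

fit_anova <- aov(score ~ time * condition + Error(id / time), data = d)
summary(fit_anova)   # the Time x Condition interaction is the effect of interest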

Method (IV) – Processes of learning: Log file dataset

[Figure: each learner’s log file as a time-stamped sequence of events, e.g. E–C–B–B–B–B–C–A–D–A–B–…]

[Figure: event sequences of all learners, from which frequent sub-sequences are identified]

Identification

* TraMineR in R Statistics

Method (V) – Processes of learning

Identification

[Figure: observed sub-sequences (observed A–D) compared across internal-condition scores (score A–D) – significant difference?]

* TraMineR in R Statistics
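A sketch of these TraMineR steps, reusing the hypothetical TSE data frame log_tse from above plus an invented per-learner grouping vector condition (one value per learner):

library(TraMineR)

# Build an event-sequence object from time-stamped events.
seqe <- seqecreate(id        = as.integer(factor(log_tse$id)),
                   timestamp = as.numeric(log_tse$timestamp),
                   event     = log_tse$event)

# Frequent event sub-sequences (support threshold as reported on the slides).
fsub <- seqefsub(seqe, pMinSupport = 0.25)

# Sub-sequences that discriminate between conditions (chi-square based).
discr <- seqecmpgroup(fsub, group = condition)
plot(discr)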

ANALYSIS

• Internal conditions: cognition and motivation (1)

o Prior domain knowledge – test score – Cronbach’s alpha (.76)

o Motivation – Cronbach’s alphas (all between .73 and .95)

o Domain knowledge test – test score – Cronbach’s alpha (.83)

• Products of learning (2)

o Tests of assumptions for normality (Shapiro–Wilk test), sphericity (Mauchly’s test of sphericity), and homogeneity of variances (Levene’s test)

o 2×2 mixed ANOVA (Time as within-subject & Condition as between-subject factor)

• Processes of learning (3)

o Exploratory visualization of reflection cue use

o Frequent event sub-sequences (pMinSupport = .25 and K = undefined)

o Discriminant event sub-sequences per variable based on chi-square (χ²)

(1) Cicchetti, 1994; (2) Elliot & Murayama, 2008; Liu & Wang, 2005; (3) Müller, Studer, Gabadinho, & Ritschard, 2010
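A sketch of the reliability and assumption checks listed above with standard R functions; the invented Likert responses in items and the data frame d from the ANOVA sketch above are hypothetical:

library(psych)   # Cronbach's alpha
library(car)     # Levene's test

items <- data.frame(q1 = sample(1:5, 41, replace = TRUE),   # invented Likert items
                    q2 = sample(1:5, 41, replace = TRUE),
                    q3 = sample(1:5, 41, replace = TRUE))

alpha(items)$total$raw_alpha               # internal consistency of a scale
shapiro.test(d$score)                      # normality
leveneTest(score ~ condition, data = d)    # homogeneity of variances
# Mauchly's sphericity test is reported by mixed-ANOVA wrappers such as ez::ezANOVA.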

Results – Products of learning

• Within-subject effect of time:

o Performance approach orientation (F(1, 8) = 6.564, p = .034, ηp² = .45)

o Learning confidence (F(1, 8) = 7.498, p = .026, ηp² = .48)

o Domain knowledge (F(1, 8) = 46.716, p < .001, ηp² = .85)

• Within-subject effect of time × condition:

o Performance avoidance orientation (PAV) (F(1, 8) = 7.374, p = .026, ηp² = .48)

Regardless of the intervention, learners in both the control and experimental conditions increased in performance approach, learning confidence, and domain knowledge.

In the experimental condition learners increased in performance avoidance orientation, whereas learners in the control condition decreased.

Results – Processes of learning

• Effect of condition on significant discriminant sub-sequences:

o Sig. more sequences including communication tools (χ²(2) = 5.49, p = .02)

o Sig. more sequences with regard to task submission (χ²(2) = 8.33, p = .003)

o Sig. more sequences relating to tests (χ²(2) = 6.27, p = .01)

* Discriminant sub-sequences

In the experimental condition learners used significantly more sequences related to features that enable testing and comparison.

Discussion (I)

Products of learning

• Increase over time in performance approach, learning confidence, and domain knowledge.

• The reflection cues provided only increased learners’ performance avoidance orientation.

Processes of learning

• Significantly more sequences related to features that enable testing and comparison.

o Mastery goals focus on gaining competence, while performance goals focus on demonstrating performance to others and oneself (e.g., Kovanović, Gašević, Joksimović, Hatala, & Adesope, 2014).

o Cues might have prompted learners to test their own performance against others’ and so helped reduce the anxiety associated with a feeling of potential looming failure (e.g., Crippen, Biesinger, Muis, & Orgill, 2009).

o Learners seek to demonstrate that they are not incompetent and hence not doing worse than others (Collazo, Elen, & Clarebout, 2015).

Discussion (II) – Cue use

* Tableau 10.3

No significant impact of (prior) learner characteristics on the distribution, frequency, or lifecycle of use of the reflection cues.

Conclusion (I) – Reflection cues

Increase (over time) in performance avoidance orientation – products

Cues used, no significant differences among learners – cue use

Significant behavioural changes (use of sequences containing tests, assignments, and communication tools) – processes

Although the cues were used, the intervention did not seem to support the learners sufficiently toward improved (favourable) learning outcomes, implying that supporting reflection alone is insufficient to evoke goal-directed learning behaviour.


Elaborated insights through Learning Analytics (opportunities and challenges)

Continuous, unobtrusive, and predictive

[Figure: continuous streams of logged events per learner, with interventions and resulting products, framed by internal and external conditions]

Opportunities

(but also…)

Operationalization: Log file dataset

[Figure: from raw log-file event streams per learner to the identification of sub-sequences]

Identification

Challenges

Unit of analysis and grain size

[Figure: learner behaviour decomposed into features (A, B, C, …) at levels (1, 2, 3, …), framed by external and internal conditions]

Measurement

Challenges

Level of inferences

[Figure: from data-level events (event 1, 2, 3, …) through interpretation to inferred states (state 1, 2, 3, …), cognitive processes, and ultimately self-regulated learning behaviour]

Using multimodal data to monitor engagement during face-to-face and remote learning

Annelies Raes, Fien Depaepe, Pieter Vanneste,

Wim Van den Noortgate, Ine Windey

Learning analytics for educational decision making

(Janssen, Molenaar, & Van Leeuwen, 2018)

Cartoon: Joris Snaet

New challenges in future learning spaces

• Student perspective:

Remote/online students have different experiences → how can we make sure students still feel connected and stay engaged?

• Teacher perspective:

Increased workload → cognitive (over)load of the teacher when teaching and overseeing the smart classroom. “We want to know when we are losing our students.”

Remote classroom – Hybrid virtual classroom

Enhancing interactivity

Interactive lecture: Quizzes, polls & silent questions

Passive → (inter)active

Main objectives

• What is the effect of interactive quizzes on students’ engagement?

o How can engagement be measured?

Engagement = complex & multidimensional concept

Behavioral

E.g. attending lectures, asking questions, participating in quizzes, …

Emotional

E.g. students’ feelings of interest, happiness, anxiety, sense of belonging

Cognitive

E.g. the skills and strategies students employ to master their learning

Aspects of student engagement within taught contexts

(Dobbins & Denton, 2017)

Methodological challenge & opportunities

• Self-report measures (e.g. motivation scales) have many shortcomings and biases: they give a snapshot articulation of engagement rather than examining how emotion and behavior unfold in an interactive context (Scherer, 2004)

• New technologies offer the opportunity to observe, measure, and understand learning and assessment processes more objectively, online, and in real time. (Gomez & Danuser, 2007; Sakr, Jewitt & Price, 2016)

Multimodal learning analytics to capture ‘engagement’ – validation?

[Figure: an ‘ENGAGEMENT METER’ fed by process data (log data, psychophysiological data, audiovisual data), self-report, and post-test measures]
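Relating these modalities requires aligning the streams on a common timeline; a minimal sketch in R, with all signals invented for illustration:

physio <- data.frame(time = seq(0, 600, by = 2),   # e.g. EDA sampled every 2 s
                     eda  = runif(301))
events <- data.frame(time  = c(65, 180, 400),      # quiz onsets from log data
                     label = c("quiz1", "quiz2", "quiz3"))

# Nearest physiological sample at each logged event.
idx <- sapply(events$time, function(t) which.min(abs(physio$time - t)))
cbind(events, eda_at_event = physio$eda[idx])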

1) Optimal use of quizzes within lectures?

2) The effect of quizzes on students’ engagement?

Six lectures with varying numbers of quizzes:

1. Pilot (2 quizzes)

2. 2 quizzes

3. 1 quiz

4. 4 quizzes

5. 0 quizzes

6. 3 quizzes

1. Self-report

[Chart: intrinsic motivation (captivated / cognitively stimulated), scale 1–6, per lecture: Lecture 1 (2 Q) 4.42, Lecture 2 (2 Q) 4.22, Lecture 3 (1 Q) 4.36, Lecture 4 (4 Q) 4.11, Lecture 5 (0 Q) 4.11, Lecture 6 (3 Q) 4.4]

N = 17

1. Self-report

[Chart: expected & experienced effect on learning (perceived usefulness, PU) of quizzes, scale 1–6, across Expectations and Lectures 1 (2 Q), 2 (2 Q), 3 (1 Q), 4 (4 Q), 5 (0 Q), 6 (3 Q); values between 4.55 and 4.97]

No novelty effect

1. Self-report

Lecture 5 without quizzes: Did you miss the quizzes?

83% – Yes, it would have been more fun with one or more quizzes; 17% – No

2. Audiovisual data

STEP 1 – Recording interactive courses

STEP 2 – Annotating individual students’ behaviour

STEP 3 – Using computer vision techniques to automatically recognize these actions in real time

STEP 4 – Using machine learning techniques to discover patterns between the behaviour and the manual annotations (a sketch follows below)
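A hypothetical sketch of STEP 4: training a classifier that links automatically extracted behavioural features to the manual engagement annotations. Feature names and data are invented; the slides do not specify the model.

library(randomForest)

set.seed(3)
train <- data.frame(
  gaze_on_screen = runif(100),                                # extracted features
  head_movement  = runif(100),
  posture_change = runif(100),
  engaged        = factor(sample(c("yes", "no"), 100, replace = TRUE))  # annotations
)

rf_fit <- randomForest(engaged ~ ., data = train)
predict(rf_fit, newdata = train[1:3, ])   # engagement prediction per observation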

3. Log data – participation in quizzes and accuracy

3. Log data – web browser activity (‘Engagement Monitor’ by Distrinet)

4. Psychophysiological data

[Figure: physiological signal across the lecture with quiz moments Q1–Q4 marked]

Work in progress

• Modelling (dis)engagement

• Visualising (dis)engagement in a dashboard → educational decision making

• Implementation & testing within the context of multi-location learning

Thank you for your attention.

Suggestions or questions?