
Learning analytics interventions

to support the transition from secondary to higher education

STELA Erasmus+ project (562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD)


"The European Commission support for the production of this publication does not constitute an endorsement of the contents which reflects the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein."



Contents

1 Introduction to the themes

2 MOOCs and SPOCs for future students
  2.1 Learning tracker
    2.1.1 Learning Tracker in MOOCs
    2.1.2 Learning Tracker in SPOCs
  2.2 Dashboard for teachers and course builders
    2.2.1 Three underlying principles

3 Feedback on academic skills
  3.1 First-year students - LASSI dashboard
  3.2 Future students - POS dashboard
  3.3 Three underlying principles

4 Feedback on academic achievement
  4.1 First-year students and beyond - REX dashboard
  4.2 Future students - POS dashboard
  4.3 Three underlying principles

5 Feedback on academic engagement
  5.1 The NTU Student Dashboard
  5.2 Three underlying principles

6 Conclusion
  6.1 Three principles underlying the different interventions
  6.2 Sustainability of learning analytics interventions
  6.3 Recommendations

Authors

Bibliography


Chapter 1

Introduction to the themes

This report discusses the learning analytics interventions supporting the transition from secondary to higher education as developed within the STELA project [1].

The interventions are organized thematically in the different chapters:

• Chapter 2 elaborates on the use of Massive Open Online Courses (MOOCs) and Small and Private Online Courses (SPOCs) to support prospective students in their transition to higher education.

• Chapter 3 elaborates on the use of learning dashboards to provide feedback to both prospectiveand first-year students regarding their academic skills.

• Chapter 4 also focuses on learning dashboards for both prospective and first-year students butnow regarding feedback on academic achievement.

• Chapter 5 elaborates on a learning dashboard providing feedback on academic engagement.

The interventions developed within the STELA project [1] are founded on three common underlyingprinciples:

Principle 1: Actionable feedback

Feedback has proven to be a powerful tool for improving student achievement, but its effectiveness depends on the type of feedback and the circumstances under which the feedback is given [11]. During the transition from secondary to higher education this feedback is considered pivotal for student motivation, confidence, retention, and success [14, 16].

The interventions within the STELA project focus on actionable feedback. This means that the feedback should allow those receiving it (student, prospective student, teacher, etc.) to take action based on what is provided. Put differently, feedback is only actionable if it provides an opportunity for improvement to those receiving it.

Principle 2: Good return-on-investment

Interventions are designed such that they allow for a good return-on-investment. The interventions within the project aim to achieve this in two ways:

• by targeting particular online courses, reaching many users, with tailored solutions (Chapter 2),

• by targeting entire programs at higher education institutes with rather generic, but scalable solutions, such that they can be easily transferred to other programs or higher education institutes (Chapter 3, Chapter 4, Chapter 5).


Both types of interventions focus on scalable implementations of learning analytics.

Principle 3: Social comparison

The transition from secondary to higher education is challenging from both the academic and the social perspective. Students have to adapt their study and learning strategies to the new context of higher education, but a priori it is often not clear to students how and to what extent they have to adapt. Social-comparison theory [10] states that people evaluate their abilities through comparison to others when they lack objective means of comparison. As students enter a new social group when starting in higher education, they lack a comparison framework, which induces uncertainty about their abilities. Social comparison has shown potential to increase student achievement [2].

The feedback in the interventions of the STELA project provides prospective or first-year students with the ability to position themselves with respect to their peers.

The following chapters briefly summarize how the three principles above apply to each of the interventions.


Chapter 2

MOOCs and SPOCs for future students

This chapter elaborates on two learning analytics interventions that support online courses focusing on aspiring students.

2.1 Learning tracker

The idea of a learning tracker is to provide learners with feedback on their online learning. Offering learners the opportunity to compare their behavior with that of their peers promotes increased student achievement in formal learning environments [2, 12]. Students in in-person classrooms can easily identify role models, regularly monitor these role models' behavior, and compare it to their own. However, this affordance of social comparison is missing in most online "classrooms". Instead, online learners need to be self-directed and regulate their learning process independently, with only sparse social and normative signals.

2.1.1 Learning Tracker in MOOCs

(The description of the Learning Tracker below is based on [9].)

The Learning Tracker case study explored to what extent the insights from social comparison can be translated to support learners in Massive Open Online Courses (MOOCs). To this end a "Learning Tracker" was developed that provides feedback based on social comparison in an online learning environment. This Learning Tracker is a personalized and scalable feedback system that presents MOOC learners with a visual comparison of their behavior to that of peers who successfully completed a past iteration of the course. Figure 2.1 presents an example of the Learning Tracker.

The Learning Tracker was built into the edX MOOC platform: the visualization was placed in the Weekly Introduction unit of each course week so that it would be readily available and immediately visible to learners upon entering the new course week, enabling them to react to their learning behavior so far.

In the study performed within the STELA project the Learning Tracker was applied to the Pre-University Calculus MOOC developed by TU Delft¹. This self-paced MOOC aims at preparing learners for the introductory calculus courses in the first year of higher education by revising four important mathematical subjects that are assumed to be mastered by beginning Bachelor students: functions, equations, differentiation, and integration. The MOOC consists of six modules of one week each, and one final exam. The modules consist of a collection of three-to-five-minute lecture videos, inspirational videos on the use of mathematics in science, engineering, and technology, (interactive) exercises, homework, and exams. The MOOC is offered for free, although a verified certificate can be obtained for $50.

The learning tracker for the Pre-University Calculus MOOC provided feedback on six summarizing measures of the learners' online activity: quiz submission timeliness (days), number of quiz questions attempted, average time on the platform per week (in hours), number of revisited video lectures, number of forum contributions, and percentage of time spent on quizzes [9]. After each MOOC course week, the metrics are calculated using the cumulative activity over all weeks so far.

¹ Pre-University Calculus MOOC: https://www.edx.org/course/pre-university-calculus

Figure 2.1: Example of the Learning Tracker visualizing the learning activity of a MOOC learner in comparison to past users who successfully completed the MOOC [9].

The learning tracker visualizes the activity profile of a student (the blue area in the graph) using the above six measures in a radar chart, as shown in Figure 2.1. To motivate users to be active in the MOOC, the learning tracker additionally visualizes the activity profile of previous learners that successfully completed the MOOC (the yellow area in the graph).
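As an illustration, the comparison behind such a radar chart can be sketched as follows. The six metric names mirror the measures listed above, but the normalization (scaling each axis so that the successful-peer average equals 1.0) and all identifiers are our own assumptions; the report does not specify how the Learning Tracker scales its axes.

```python
# Illustrative sketch of the Learning Tracker comparison (not the STELA code).
# Each radar-chart axis shows a learner's cumulative metric relative to the
# average of peers who completed a past run of the MOOC.

METRICS = [
    "quiz_timeliness_days",
    "quiz_questions_attempted",
    "hours_per_week",
    "videos_revisited",
    "forum_contributions",
    "pct_time_on_quizzes",
]

def radar_values(learner, successful_peers):
    """Scale each metric so that the successful-peer average maps to 1.0."""
    values = {}
    for m in METRICS:
        peer_avg = sum(p[m] for p in successful_peers) / len(successful_peers)
        values[m] = learner[m] / peer_avg if peer_avg else 0.0
    return values
```

Under this scaling, a value below 1.0 on an axis means the learner is less active than the successful-peer average on that measure.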

Three underlying principles

Principle 1: Actionable feedback

The learning tracker provides learners with feedback on summarizing learning measures derived from their learning activities in the MOOC. The summarizing measures are defined such that they are interpretable to the learners, as they are directly connected to particular activities in the MOOC (videos, quizzes, forum). Based on the feedback, learners can adapt their activities in the MOOC for the following weeks.

Principle 2: Good return-on-investment

The learning tracker can be integrated within the MOOC itself. It has the potential of being automatically generated from the log files of the MOOC, showing large potential for scalability. Furthermore, MOOCs typically reach a huge number of learners, which more easily makes the investment in learning analytics worthwhile.

On the negative side, the summarizing learning activities that are predictive for MOOC completion can depend on the particular MOOC and its course design. Therefore, the summarizing measures of the learning tracker should be reassessed when used in a different MOOC.


ID  Variable     Description

Variables related to accesses to the platform
2   ndays        Number of days the student has accessed the platform

Variables related to interactions with videos
9   per_open     Percentage of opened videos

Variables related to interactions with exercises
13  avg_grade    Average grade of formative exercises (only using the first attempts)
14  avg_attempt  Average number of attempts in the exercises attempted
15  per_correct  Percent of correct exercises over attempted exercises (using all attempts)
16  CFA          Number of 100% correct exercises in the first attempt
17  streak_ex    Longest consecutive run of correct exercises
18  nshow        Number of times the user asks for the solution of an exercise (without submitting an answer)

Table 2.1: Eight summarizing activity metrics for a SPOC most predictive for passing the entrance exam of medicine [13].

Principle 3: Social comparison theory

The learning tracker allows each learner to compare his/her own learning activities with those of past learners that successfully completed the MOOC. By providing such "upward" comparison, the learning tracker stimulates activity within the MOOC and can increase MOOC completion [9].

2.1.2 Learning Tracker in SPOCs

(The description of the case study below is based on [13].)

Within this case study it was explored whether the idea of the Learning Tracker as developed for MOOCs (Section 2.1.1) is also applicable to SPOCs (Small and Private Online Courses).

The case study for the STELA project [1] focused on a SPOC about chemistry, which was developed on edX Edge as a joint project of the Faculty of Science and the Faculty of Medicine at KU Leuven. The SPOC consists of 11 modules including 66 videos and 121 exercises, which cover the required contents for the chemistry component of the medicine admission test in Flanders. This entrance exam contains several tests, although the SPOC focused only on chemistry. The SPOC was part of a blended learning support program: online modules were released gradually every fortnight (from September to May) and alternated with three face-to-face interactive sessions. Nevertheless, in practice many students enrolled late and studied at their own pace. The target users were students in the last year of secondary school (in the academic year 2016-2017) who wanted to enter medicine at any university in Flanders and paid a registration fee for the blended learning program.

The focus of the case study was to research whether summarizing measures for learning activity can be defined, based on the online SPOC activity, that are predictive for passing the high-stakes entrance exam. Were this possible, a learning tracker similar to the ones used in MOOCs could be built. To this end a predictive analysis was performed using 18 summarizing learning activity measures. Results show a statistically significant difference in most of the summarizing learning measures between students passing the entrance exam and those failing it, which suggests that the learning behavior in the SPOC relates to success in the entrance exam.

Data from the 2016 run was used. In this run a total of 1,062 students accessed the course, although only 680 completed at least one exercise and only 750 had interactions with videos.

To determine which learning activity measures should be visualized in the learning tracker, the measures that contribute most to entrance exam success were identified. As a result, the eight summarizing metrics shown in Table 2.1 were obtained.
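As an illustration of such a selection step, the sketch below ranks candidate measures by how strongly they separate passing from failing students. The actual analysis in [13] may have used a different criterion; the ranking by absolute standardized mean difference, the function name, and the data layout are assumptions for illustration only.

```python
# Illustrative sketch: ranking summarizing activity measures by how strongly
# they separate students who passed the entrance exam from those who failed.
# The criterion (absolute standardized mean difference) is an assumption,
# not necessarily the method used in the STELA analysis [13].
from statistics import mean, pstdev

def rank_measures(students, measure_names):
    """Return measure names sorted from most to least discriminative."""
    passed = [s for s in students if s["passed"]]
    failed = [s for s in students if not s["passed"]]
    scores = {}
    for m in measure_names:
        spread = pstdev([s[m] for s in students]) or 1.0  # avoid div by zero
        diff = mean(s[m] for s in passed) - mean(s[m] for s in failed)
        scores[m] = abs(diff) / spread
    return sorted(scores, key=scores.get, reverse=True)
```

A measure such as ndays, which differs strongly between the two groups, would rank ahead of a measure whose distributions largely overlap.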

With these eight variables, the learning tracker was designed; it includes the summarizing learning measures for the learner at hand, together with the average learning measures of the successful and unsuccessful learners.

(a) Activity profile of a learner who passed the entrance exam. (b) Activity profile of a learner who failed the entrance exam.

Figure 2.2: Radar charts visualizing the activity profile of a learner with respect to the averages of learners who passed and failed the entrance exam, for which the SPOC is preparing the learners [13]. The eight dimensions for learning activity are: (per_correct) percent of correct exercises over attempted exercises, (avg_grade) the average grade of formative exercises, (nshow) the number of times the user asks for the solution of an exercise, (ndays) the number of days the student accessed the SPOC, (avg_attempt) the average number of attempts in the exercises attempted, (CFA) the number of 100% correct exercises in the first attempt, (streak_ex) the longest consecutive run of correct exercises, and (per_open) the percentage of opened videos.

Figure 2.2 presents the visualization of the selected variables for one learner who passed and another who failed. The aim is that students can see their performance (through some indicators) and compare it with previous students, so they can get an idea about how well they are doing with respect to other learners. The tracker may also help students to self-regulate better. In this example, the clear differences between learners suggest that the visualization can be useful to give feedback on activities within the SPOC.

Nevertheless, a limitation in this case study is that the learning activity only differed significantly towards the end of the course, as most learners are active only in the final weeks. A week-by-week visualization as in the MOOC case study cannot be achieved in the case of the SPOC.

The analysis of the data was supported by the teachers and course builders. This proved to be required in order for the researchers to understand the data and to adapt the analysis to the context. The teachers and course builders had to provide additional context information on the target audience, the timing of the course, the setup of the course (blended learning), and the assessment within the online course and the final test in particular.

Rec. 1: Use all available expertise

Teachers and course builders will provide the contextual information that is often needed to make sense of the data provided in online courses. They can add information on the targeted audience, the timing of the course, the setup of the course (e.g. blended learning), etc.


Three underlying principles

Principle 1: Actionable feedback

Similar to the learning tracker in MOOCs.

Principle 2: Good return-on-investment

As SPOCs typically reach a much smaller audience than MOOCs, the return-on-investment will be harder to obtain. On the other hand, SPOCs typically have a less heterogeneous population, allowing for more tailored feedback, which could have more impact. This, however, still has to be proven.

Learners in the SPOC in this case study had to pay for the blended learning module the SPOC was part of. When students pay for a course, they typically have higher expectations of the support that is provided to them. In the future, a learning tracker could contribute to this support.

Principle 3: Social comparison theory

The learning tracker in the SPOC would allow each learner to compare their own learning activities with those of past learners that successfully passed the entrance exam, for which the SPOC provides preparatory material.

2.2 Dashboard for teachers and course builders

Another STELA project case study focused on developing a dashboard for teachers and course builders of Small and Private Online Courses. The case study focused on the chemistry SPOC of KU Leuven (see previous section). The primary aim of the dashboard was to be a showcase for the modular open-source technology stack composed within the project. Using this technology stack, the dashboard was designed to answer three questions of the teachers and course builders:

1. Which student factors (gender, prior education, "typical student" preparing for the entrance exam) are related to SPOC activity?

2. Are students that pass the final SPOC test on average more active than students not passing the final SPOC test? This question relates SPOC activity to successful SPOC completion.

3. Are students that pass the entrance exam on average more active than students not passing the entrance exam? This question relates SPOC activity to success at the entrance exam.
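Questions 2 and 3 both boil down to comparing the average activity of two outcome groups. The following minimal sketch shows this comparison; the field names are hypothetical, as the report does not describe the data model of the dashboard or its technology stack.

```python
# Sketch of the comparison behind questions 2 and 3: average activity of
# students in each outcome group (e.g. passed vs. failed the final SPOC test).
# Field names are hypothetical illustrations, not the STELA data model.
def mean_activity_by_outcome(students, activity_field, outcome_field):
    """Return the mean of activity_field for each value of outcome_field."""
    groups = {True: [], False: []}
    for s in students:
        groups[bool(s[outcome_field])].append(s[activity_field])
    return {k: (sum(v) / len(v) if v else 0.0) for k, v in groups.items()}
```

The same function answers question 3 by swapping the outcome field from SPOC-test success to entrance-exam success.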

Figure 2.3 shows a screenshot of the dashboard.

Demo SPOC teacher dashboard

An online demo of the SPOC teacher dashboard, providing insights on how learners used a SPOC, is available at: https://stela.tugraz.at/

2.2.1 Three underlying principles


Figure 2.3: Screenshot of the dashboard for teachers and course builders for the chemistry SPOC.

Principle 1: Actionable feedback

The dashboard provides teachers and course builders with insights on the activity of learners within a SPOC. The dashboard was built to answer three questions of teachers regarding the influence of student factors on online activity and the relation between online activity and two outcome measures (successful SPOC completion and passing the entrance exam).

Principle 2: Good return-on-investment

The return-on-investment of this dashboard remains to be shown. While the dashboard could provide the teachers and course builders with insights that might lead to improvements in the course, no improvements in the course itself were triggered after the deployment of the dashboard in the case study.

Most importantly, the dashboard itself showed that the modular open-source technology stack allows a learning dashboard to be developed. Thanks to the actual experiment in the case study, the choices within the technology stack were challenged and improved.

Principle 3: Social comparison

Within the dashboard the activity of learners is compared based on certain background characteristics such as gender and prior education.

No direct social comparison is used for the target users, the teachers and course builders of the SPOC.


Chapter 3

Feedback on academic skills

This chapter elaborates on two learning dashboard interventions that provide students with feedback on their academic skills during the transition from secondary to higher education.

3.1 First-year students - LASSI dashboard

The text below is based on the STELA project case studies published in [3, 7].

Within the STELA project a learning dashboard providing feedback regarding learning and studying skills, named LASSI, was designed, developed, and deployed.

Demo LASSI learning analytics dashboard

An online demo of the LASSI dashboard, providing actionable feedback regarding five learning and studying skills, is available at: https://learninganalytics.set.kuleuven.be/static-demo-lassi/

Context In the transition from secondary to higher education, students are expected to develop a set of learning skills that will help them in their learning path, as well as in their future professional career. Higher education institutions provide information and activities to support students in improving their learning skills, e.g. through coaching, counseling, or training sessions. To direct these efforts and to measure their effectiveness, institutions need to assess the level of learning skills of their students. The Learning and Study Strategies Inventory (LASSI) is a diagnostic instrument that can be used to measure a student's level of learning skills [20]. Based on a 60-item (third edition) questionnaire, a LASSI test reveals strengths and weaknesses of an individual, and relates these to the scores of other students. Being both a diagnostic and prescriptive instrument, LASSI does not focus on student characteristics that are invariable or difficult to change, such as gender, socioeconomic status, or ethnic background. Rather, it delivers indicators for areas that offer a perspective for growth and mitigation. Students' learning and study strategies are summarized in ten scales, each targeting a specific skill shown to be relevant to study success [20]. According to its publisher, the test is currently being used in over 3,000 institutions worldwide.

Dashboard Within the STELA project an interactive dashboard that provides students with individualized feedback on their learning skills as measured through LASSI was designed, developed, and deployed. The feedback targets the five of the original ten LASSI scales that were shown to be most predictive of study success for our specific target group of STEM students [17]. These five LASSI scales are: performance anxiety, concentration, motivation, the use of test strategies, and time management.

Figure 3.1 provides a screenshot of a student's view on the dashboard. The main components of the dashboard are introduced below. Two weeks after completion of the questionnaire, an email with a link to the dashboard was sent to the students that had filled in the questionnaire.

Figure 3.1: Screenshot of a student's view on the LASSI dashboard providing feedback on five learning and studying skills.

Tabs The dashboard is divided into six tabs. On access, the first tab shows an introductory text, explaining the purpose and components of the intervention and the origin of the data it is based on. The other five tabs, alphabetically ordered, provide a separate space for each of the five learning skills.

Learning skills visualization The dashboard contains a visualization of the data underlying the intervention that allows students to compare their learning skills with those of peers in the same program (social comparison theory). A simple unit chart (Fig. 3.2) uses dots to represent the number of students within the respective norm scales for each of the five included learning skills. Each dot represents a single student and is attributed to one of five norm scales, ranging from very weak (concentration, motivation, test strategy, time management) over weak, average, and good to very good. For the failure anxiety scale, the labels were adapted to very high, high, average, low, and very low respectively. The norm group that applies to the active student is marked by a grey box.

For each skill, a second unit chart (Figure 3.3) relates the skill level of the previous year's students of the study program to the study success they obtained in the first year. This allows the students to see the relation between the learning skill at hand and study success. Here, each square represents 1% of the students of earlier cohorts within the respective norm group. In addition, the color of the squares represents the study success of these students using three categories depending on the percentage of obtained study points, the so-called study efficiency¹ (orange < 30%; yellow ≥ 30% and < 80%; green ≥ 80%). The colors are adapted for students with color vision deficiency [15]. To support the interpretation of the graphs, a textual explanation is provided.

¹ Study efficiency: https://www.kuleuven.be/english/education/student/studyprogress/cse
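The color coding of the squares follows the fixed study-efficiency thresholds stated above. A minimal sketch of that mapping (the thresholds come from the text; the function name and percentage representation are our own assumptions):

```python
# Color coding of the cohort unit chart, using the thresholds stated in the
# text: orange < 30%, yellow >= 30% and < 80%, green >= 80% study efficiency.
# Function name and input convention (percentage as a number) are illustrative.
def efficiency_color(study_efficiency_pct):
    """Map a study-efficiency percentage to the chart's color category."""
    if study_efficiency_pct < 30:
        return "orange"
    if study_efficiency_pct < 80:
        return "yellow"
    return "green"
```

Each 1%-of-cohort square in the chart would then be drawn in the color returned for the study efficiency of the students it represents.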


Figure 3.2: Visualization comparing the student's time management score (average) to the peers in his/her program.

Figure 3.3: Visualization relating the time management scores of previous years' students to the study success they obtained in the first year.


Advice for remediation For each of the five skills, the dashboard provides detailed textual guidance for remediation. The advice includes simple tips, signposts to extensive information and existing improvement activities provided by the institution, and an invitation to make a personal appointment with a student adviser. To avoid cluttering the initial message in the dashboard, this actionable improvement guidance is "collapsed". The information can be uncovered by clicking the section "How to improve?" (Figure 3.1).

Text parameterization All textual content, including the introduction, learning skill information, and improvement tips on each tab, is adapted to the study program and situation of the student, based on experience from the field, using text parameterization.

A content-editing platform was built that allows student advisers to adapt messages based on their expertise. To facilitate this process, messages are chunked into parts and made editable using Markdown, a lightweight text markup language, extended with our own dashboard-specific features like @studentName@ to insert the name of the student or @yourGroup@ to embed a part of the chart legend within the text.
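The placeholder substitution described above can be sketched as follows. The @name@ syntax is taken from the text; the replacement rule (leaving unknown placeholders untouched) is an assumption for illustration, not the dashboard's actual implementation.

```python
# Sketch of the dashboard's text parameterization: Markdown messages contain
# placeholders such as @studentName@ that are filled in per student before
# rendering. Leaving unknown placeholders as-is is our own assumption.
import re

def fill_placeholders(template, values):
    """Replace @key@ placeholders in template with entries from values."""
    def repl(match):
        key = match.group(1)
        return str(values.get(key, match.group(0)))  # unknown keys untouched
    return re.sub(r"@(\w+)@", repl, template)
```

For example, a message template "Dear @studentName@, …" would be rendered with the student's name and group inserted before the Markdown is converted for display.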

Addition to Rec. 1: Use all available expertise

Student advisers provide invaluable support for designing learning analytics interventions. They bring expertise on the advising process and on the typical challenges, experiences, needs, and lines of thought of the involved students. Their input was particularly valuable in supporting the ethics of the learning analytics dashboards: which data to display, which messages to deliver, how to refer to actions to take and additional support, etc. Furthermore, they are the experts in the particular context of the students, which allows them to make small adaptations to the dashboards to tailor them as much as possible to the context of the program at hand.

Rec. 2: Acceptance precedes impact

Before impact can be realized with learning analytics dashboards, the dashboards have to find their way to practice. Therefore, these dashboards first have to be accepted by the stakeholders. The involvement of the student advisers in the design of the learning dashboards, and in particular the possibility to edit the content of the dashboards directly, created the required acceptance.

3.2 Future students - POS dashboard

The text below is based on the STELA project case study published in [6].

Within the STELA project a learning dashboard providing feedback to students in the transition from secondary to higher education was developed. The dashboard, named POS, aimed at providing feedback to participants of the multi-institutional positioning test [18, 19], which tests the mathematical skills of aspiring STEM students in the months before entering higher education.

Demo POS learning analytics dashboard

An online demo of the POS dashboard, providing actionable feedback to aspiring students after participating in the positioning test, is available at:
https://feedback.ijkingstoets.be/ijkingstoets-11-ir-ignore/feedback
code: 11ir0demo

Chapter 3. Feedback on academic skills

Context Since 2011, the Flemish universities offering engineering bachelor programs have joined efforts to organize a ‘positioning test’, a diagnostic test of the prospective students’ ability to solve math problems [8, 19]. The multiple-choice test is organized in the summer between the end of secondary education and the start of higher education. Since 2015, a facultative questionnaire including three scales (concentration, motivation, and time management) from the Learning and Study Strategies Inventory (LASSI) [20] is appended to the positioning test.

The context of the dashboard is challenging and involves not only one institution, but also external stakeholders from other (competing) universities. The target audience of aspiring STEM students has no formal relationship with the institution yet at the time of distribution of the learning analytics dashboard.

Dashboard The POS dashboard (Figure 3.4) contains feedback on three dimensions:

• the positioning test itself,

• the prior academic achievement (measured using a self-reported questionnaire),

• learning and studying skills (measured using three scales of the LASSI questionnaire [20]).

The learning and studying skills are of interest in this chapter. The screenshot of Figure 3.4 shows the part of the POS dashboard dedicated to the learning and studying skills. Three scales of the LASSI questionnaire were used: concentration, motivation, and time management.

The intervention regarding learning and studying skills in the POS dashboard has a design similar to that of the LASSI dashboard. It focuses on three parts:

1. positioning the learning skills of the participants in the positioning test with respect to the other participants (first ‘card’ containing the first unit chart in Figure 3.4),

2. showing the importance of the learning skills by showing the relation between the learning skill and the study success of earlier participants in the first year (second ‘card’ containing the second unit chart in Figure 3.4), and

3. providing advice for remediation (third ‘card’, titled “Tips om aan je tijdbeheer te werken” (“Tips to work on your time management”), in Figure 3.4).

Content contribution by student advisers As with the LASSI dashboard, the feedback dashboard was designed in close collaboration with student advisers involved in the positioning tests. Once the dashboard was given shape, a preview version was made available to them. Using a set of dummy feedback codes, student advisers could run different scenarios, seeing the dashboard through the eyes of students with profiles of different strength. Student advisers were able to adapt each of the dashboard’s components (‘cards’) in response to specific scenarios by simply CTRL+double-clicking the part to be edited. This opens a scenario-based parametrization interface. All information is saved and edited in Markdown format, extended with a set of dashboard-specific keywords, e.g. @numberCorrect@ for the number of positioning test questions answered correctly by a student or @mathGroup@ for the secondary school math result group the student belongs to. Some keywords, like @green@ and @red@, are available to include an in-line legend for the charts within the text.

3.3 Three underlying principles

Principle 1: Actionable feedback

The LASSI and POS dashboards provide feedback on learning and studying skills. The measured skills are indicators for areas that offer a perspective for growth and remediation. Within the dashboards, particular pointers for remediation are provided, ranging from small tips and tricks, over additional training, to pointers for personal conversations with the student adviser. Furthermore, the LASSI dashboard has within the project already been applied at two institutes: KU Leuven (26 programs) and TU Delft (two programs).

Figure 3.4: Full-screen view of the dashboard for a user navigating to the main category ‘learning skills’ (1), drilling down further to the feedback about ‘time management’. First, the individual student’s score is put into context (2) and compared to the scores of other students participating in the test. Second, the score is compared (3) to last years’ students and their rate of study success in the program. At the bottom (4), a series of tips is provided for the student wishing to improve, alongside links to resources available in the different universities [6].

Principle 2: Good return-on-investment

LASSI and POS in fact contain the same intervention regarding feedback on learning and studying skills. This shows that the intervention is usable both for first-year students (LASSI) and for aspiring students (POS). Thanks to content-editing platforms (allowing student advisers to make adaptations to the dashboards to tailor them to the particular context of the program) and text parameterization (automatically filling in the right program name, etc.), the dashboards can easily be scaled over different programs, allowing a wide range of students to be reached with limited investment.

Principle 3: Social comparison

The LASSI and POS dashboards allow users to compare their own learning and studying skills to the entire peer group (for LASSI: the other first-year students of the same program; for POS: the other aspiring students that participated in the positioning test).

Chapter 4

Feedback on academic achievement

This chapter elaborates on two learning dashboard interventions that provide students with feedback on their academic achievement during the transition from secondary to higher education.

4.1 First-year students and beyond - REX dashboard

The text below is based on the STELA project case study published in [4].

Within the STELA project a learning dashboard providing feedback regarding academic achievement, named REX, was designed, developed, and deployed.

Demo REX learning analytics dashboard

An online demo of the REX dashboard providing actionable feedback regarding academic achievement is available at:
https://learninganalytics.set.kuleuven.be/static-demo-rex/

Context The goal of the REX dashboard is to support students in higher education by providing them with feedback on exam results, encouraging reflection, and adding actionable recommendations for improvement. The dashboard was to demonstrate how simple, “small data” that is readily available within institutions can be used as a starting point to fuel the organizational learning process around learning analytics and to collect valuable feedback about the intervention.

On the course level, several reasons may lead to a certain degree of ambiguity when students, and especially newcomers, in higher education receive their grades. Often, students are not “graded on a curve”, meaning that scores are not relative to those of other students. Moreover, different universities use different rating scales and thresholds for passing, and in many study programs it is common for distributions of scores to differ significantly from one course to another. Most secondary schools use different grading scales, and even in relative terms the comparability of results in secondary and higher education is limited. On the program level, students need to get acquainted with a crucial key performance indicator or measure for global study progress; at KU Leuven this is for instance the study efficiency: the ratio of credits a student passed throughout the years to the total number of credits taken within the study program. Such key performance indicators are often used in regulations that force students to keep progressing throughout their study program (for an example at KU Leuven, see footnote 1).
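The study efficiency measure described above is a simple ratio; a minimal sketch, with the function name and error handling as illustrative assumptions:

```python
def study_efficiency(credits_passed: int, credits_taken: int) -> float:
    """Study efficiency as described for KU Leuven: the ratio of credits
    a student passed throughout the years to the total number of credits
    taken within the study program."""
    if credits_taken <= 0:
        raise ValueError("credits_taken must be positive")
    return credits_passed / credits_taken

# A student who passed 48 of the 60 credits taken has an efficiency of 0.8.
print(study_efficiency(48, 60))  # → 0.8
```

Progress-monitoring regulations would then compare this ratio against a program-defined threshold; the threshold itself is not specified here.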

Dashboard The REX dashboard visualizes exam results on the level of individual courses and the study program.

The dashboard (Figure 4.1) is composed of five tabs:

1 https://www.kuleuven.be/english/education/student/studyprogress/Refusalregistration

Figure 4.1: Screenshot of a student’s view on the REX dashboard providing feedback on academic achievement.

1. Introduction (“Inleiding”): the first tab contains an introductory text explaining the objectives and contents of the other four tabs. It includes a message explaining that scores offer only a partial view of who the students are, and provides a direct link to the study counselor and/or support service assigned to the study program.

2. Scores per course (“Vakken”): the second tab contains the exam results for each course the student is enrolled in. The initial view shows a numeric score on a 0-20 scale and prompts the student to answer some questions like: “This score is [. . . ] than what I expected after the exam” and “I feel [. . . ] about this result.” (the number and actual questions were changed over the different iterations). When the questions are answered, the score of the student is compared to the scores of the peer group in a visualization. Within this visualization, the student is positioned in one of four groups. A second button offers the option to position the individual score more precisely within the group, leaving it up to the students to choose whether they would like to see this information or not. This sequence is repeated for each of the listed courses. The tab concludes with an invitation to reflect on what is shown above, some additional remarks framing what is shown, and an open invitation to get in touch with a student counselor.

3. Global progress (“Globaal”): the middle tab informs about the global study progress. The upper part summarizes the progress of the student on the program level. It contains a visualization to put the individual result in perspective, comparing it to other students within the same study program. The lower part (see Figure 4) of the “global” tab explains and visualizes how the global progress of students within the same study program in previous years relates to the number of years they spent to obtain the bachelor diploma: three years (nominal time), four years, or five or more. Drop-out students are indicated as well. For the category that corresponds best to the student, this information is repeated in an accompanying text.

4. Tips: the fourth tab’s content is entirely textual and is divided into two parts. The first part contains four sections on how to process the information contained in the dashboard: “talk about it”, “realistic expectations”, “learning skills” – including a link to a previous dashboard on learning and studying skills (Chapter 3) – and other factors (e.g. personal situation). The second part contains actionable advice on how to set goals and tips on how to achieve them.

5. Regulations (“Regelgeving”): the last tab provides a summary of the progress monitoring regulations with respect to the global study progress (see above). As with most of the textual information in the dashboard, this information is also available online, but it is fragmented across several web pages following an organization-oriented structure. The REX dashboard collects relevant information from different sources and offers an integrated and personalized view, adapted to the program and situation of the student, and focused on the specific context of the first exam results.
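The positioning of a score in one of four peer groups (second tab above) can be sketched as follows. The report does not specify how the four groups are cut; the quartile split below is an assumption for illustration only.

```python
from bisect import bisect_right

def score_group(score: float, peer_scores: list) -> int:
    """Position a student's exam score (0-20 scale) in one of four
    groups relative to the peer scores. The four groups are taken here
    to be quartiles of the peer distribution -- an illustrative
    assumption, not the documented REX behavior."""
    ordered = sorted(peer_scores)
    n = len(ordered)
    # Quartile boundaries of the peer distribution.
    cuts = [ordered[n // 4], ordered[n // 2], ordered[3 * n // 4]]
    return bisect_right(cuts, score) + 1  # group 1 (lowest) .. 4 (highest)

peers = [6, 8, 9, 10, 11, 12, 13, 14, 16, 18]
print(score_group(13, peers))  # → 3
```

The coarse group is shown first; positioning the score more precisely within the group (the “second button”) would then reveal the exact rank.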

Content contribution by student advisers and scenario-based parameterization As with the LASSI and POS dashboards, the feedback dashboard was designed in close collaboration with student advisers, who could tailor the REX dashboard to the particular needs of their program through a content-editing platform. Furthermore, text parameterization was used to offer automatic adaptations to the particular program and the student at hand (e.g. @program@).

4.2 Future students - POS dashboard

The text below is based on the STELA project case study published in [6].

Demo POS learning analytics dashboard

An online demo of the POS dashboard, providing actionable feedback to aspiring students after participating in the positioning test, is available at:
https://feedback.ijkingstoets.be/ijkingstoets-11-ir-ignore/feedback
code: 11ir0demo

As explained in Section 3.2, within the STELA project a learning dashboard was developed that provides feedback to students in the transition from secondary to higher education. The dashboard, named POS, aims at providing feedback to participants of the multi-institutional positioning test [18, 19], which tests the mathematical skills of aspiring STEM students in the months before entering higher education.

On top of the learning and studying skills elaborated on in the previous section, the POS dashboard (Figure 3.4) contains feedback on:

• the positioning test itself, and

• the prior academic achievement (measured using a self-reported questionnaire),

which this section elaborates on.

Feedback on multiple-choice questions of the positioning test The dashboard provides feedback on each individual question. Figure 4.2 shows the feedback on one particular question. It allows the participants to position their reply to the multiple-choice question with respect to the replies of the other participants.
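A text rendering of such a per-question unit chart can be sketched as below; the counts, symbols, and function name are invented for illustration, not taken from the POS implementation.

```python
def unit_chart(counts: dict, correct: str, chosen: str) -> str:
    """Render a text version of the answer unit chart: one dot per
    participant per answer option (A-D or 'blank'), the correct option
    marked, and the user's own choice bracketed."""
    lines = []
    for option, count in counts.items():
        marker = " (correct)" if option == correct else ""
        label = f"[{option}]" if option == chosen else f" {option} "
        lines.append(f"{label} {'•' * count}{marker}")
    return "\n".join(lines)

print(unit_chart({"A": 3, "B": 5, "C": 9, "D": 2, "blank": 1},
                 correct="C", chosen="B"))
```

In the real dashboard the dots are colored (green for correct, orange for wrong, yellow for blank) and the user’s answer carries a blue border, as described in the caption of Figure 4.2.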

Feedback on the global positioning test score The feedback on the global positioning test score allows the participant to:

• position the participant’s global positioning test score with respect to the other participants (Figure 4.3), and

• assess the relation between the global positioning test score and first-year student success based on data of earlier participants (Figure 4.4).

Feedback on prior academic achievement Finally, the POS dashboard contains feedback on prior academic achievement, operationalized by the percentage obtained for mathematics in secondary education and the advice received from the teaching board, with respect to the other participants. Similar to the global positioning test score, the participants can position their prior academic achievement with respect to their peers and assess the relation between prior academic achievement and first-year student success based on historic data.

Figure 4.2: Extract of the dashboard view, focusing on the feedback on a multiple-choice question in the positioning test. The question and its associated figures and formulas are displayed using identical typesetting as on the (paper) test document to improve visual recognition. The unit chart shows the number of test participants selecting answers A, B, C, D, or blank respectively. The correct answer is displayed using green dots, wrong answers using orange dots, and blank answers using yellow dots. The blue border around answer B indicates that this was the dashboard user’s (incorrect) answer.

Figure 4.3: Unit chart for the answers of students to one of the positioning test’s multiple-choice questions, within the POS dashboard [6].

Figure 4.4: Unit chart showing how the global positioning test score relates to first-year academic achievement, within the POS dashboard [6].

4.3 Three underlying principles

Principle 1: Actionable feedback

While the REX and POS dashboards provide feedback on academic achievement, which cannot be changed directly, the feedback in the dashboards is still considered actionable. The feedback targets the underlying academic engagement, learning skills, and individual circumstances influencing academic achievement. By pointing to these factors and by providing tips and references to improve them, the dashboards still call the users to reflection and action.

Principle 2: Good return-on-investment

REX and POS in fact contain the same intervention regarding feedback on academic achievement. This shows that the intervention is usable both for first-year students (REX) and for aspiring students (POS). Moreover, REX has been extended to all students in bachelor programs, showing its applicability beyond the first year. Thanks to content-editing platforms (allowing student advisers to make adaptations to the dashboards to tailor them to the particular context of the program) and text parameterization (automatically filling in the right program name, etc.), the dashboards can easily be scaled over different programs and different academic years, allowing a wide range of students to be reached with limited investment.

Principle 3: Social comparison

The REX and POS dashboards provide opportunities for users to position their academic achievement with respect to their peers. The REX dashboard offers this opportunity for individual exam scores and global academic achievement. The POS dashboard accommodates social comparison for answers to the different questions of the test, the global positioning test score, and prior academic achievement.

Chapter 5

Feedback on academic engagement

This chapter elaborates on the NTU Student Dashboard, which provides first-year students with feedback on their academic engagement.

5.1 The NTU Student Dashboard

Within the STELA project, Nottingham Trent University (NTU) shared experiences on their Student Engagement Dashboard, which provides students with feedback on their engagement and provides tutors with support in guiding their students.

The learning analytics intervention, named the NTU Student Dashboard, was developed in collaboration with the external provider of the tool, Solutionpath. Besides information from the institution’s data records system, the dashboard focuses on engagement. Engagement is calculated from different data sources (see the Output on Data collection) using a proprietary algorithm. This section focuses on the engagement data. Engagement scores are numbers that can be displayed graphically to show trends in engagement over time, whereas engagement ratings are words that add meaning to the numbers. Engagement ratings of ‘Very Low’, ‘Low’, ‘Partial’, ‘Good’, and ‘High’ categorize the numeric engagement score.
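The mapping from numeric engagement scores to the five ratings could be sketched as below. The band boundaries and the assumed 0-100 score range are placeholders for illustration, since the actual algorithm is proprietary to Solutionpath.

```python
def engagement_rating(score: float) -> str:
    """Map a numeric engagement score (assumed 0-100) to one of the five
    NTU engagement ratings. The band boundaries are invented placeholder
    values; the real Solutionpath algorithm is proprietary."""
    bands = [(20, "Very Low"), (40, "Low"), (60, "Partial"), (80, "Good")]
    for upper, rating in bands:
        if score < upper:
            return rating
    return "High"

print(engagement_rating(35))  # → Low
print(engagement_rating(90))  # → High
```

Separating the numeric score (for trend charts) from the verbal rating (for at-a-glance meaning) mirrors the distinction the dashboard itself makes.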

The student-facing dashboard provides the student with feedback on their engagement using two formats:

1. a summarizing engagement rating, and how it changes over time (Figure 5.1), and

2. a comparison of engagement with respect to peers in the program, and how it changes over time (Figure 5.2).

Figure 5.1: Overview of the summarizing engagement rating of a student over the last five days (mockup) in the Student Engagement Dashboard of Nottingham Trent University.

Figure 5.2: Overview of the cumulative engagement of a student over time (blue) in comparison with the average of other students in the program (red) in the Student Engagement Dashboard of Nottingham Trent University.

NTU embedded students within the project team.

Addition to Rec. 1: Use all available expertise

Students are the end users of the education provided by the higher education institute and should therefore not be overlooked. Students can help to assess the implications of the choices made and can contribute their actual experiences in higher education. Furthermore, they help to guide the ethical implementation of the learning analytics interventions.

5.2 Three underlying principles

Principle 1: Actionable feedback

The NTU Student Dashboard provides feedback on academic engagement. This feedback is actionable, as students can decide, based on the feedback, to change their engagement with their course by, e.g., attending more classes, handing in assignments on time, using the library more frequently to consult additional resources, or using the Virtual Learning Environment more intensively for course preparation.

Principle 2: Good return-on-investment

The NTU Student Dashboard is deployed within the entire Nottingham Trent University. NTU found that low engagement as recorded by the dashboard correctly identifies the students most at risk, allowing the university to take action for these students. The action is taken based on the behavior of the student, not on the student’s background, although engagement proved to be related to student background characteristics such as gender and minority groups.

Principle 3: Social comparison

The NTU Student Dashboard allows students to compare their engagement with respect to the peers in their program.

Chapter 6

Conclusion

This chapter summarizes the learning analytics interventions to support the transition from secondary to higher education developed within the STELA project by:

• Showing how the three underlying principles of actionable feedback, good return-on-investment, and social comparison theory were applied in the different interventions (Section 6.1), and

• Repeating the recommendations made based on the learning analytics interventions of the STELA project (Section 6.3).

6.1 Three principles underlying the different interventions

Principle 1: Actionable feedback

That feedback should be actionable was at the heart of the interventions of the STELA project. Within the different interventions, two ways to obtain actionable feedback were used:

• Feedback originating from malleable skills or changeable behaviour
Actionable feedback is most easily obtained if it originates from malleable skills or changeable behaviour. In the project case studies, examples of interventions aiming for this are:

– the Learning Tracker, providing feedback on online learning activities over the duration of the online course (Chapter 2),

– the LASSI dashboard and POS dashboard, providing feedback on learning and studying skills (Chapter 3), and

– the NTU Student Dashboard, providing feedback on academic engagement within a program (Chapter 5).

• Feedback originating from non-changeable characteristics:

– the REX dashboard, providing feedback on exam grades and global academic achievement (Section 4.1), and

– the POS dashboard, providing feedback on the positioning test and prior academic achievement (Section 4.2).

Although the dimensions on which feedback is provided cannot be changed any more, the dashboards provide feedback on the underlying malleable causes of student success.

The interventions supported the actionability of the feedback by including concrete tips for actions to take based on the feedback.

Rec. 3: Feedback should be actionable

Data-based feedback should be actionable, i.e. it should allow the user (student, prospective student, teacher, etc.) to take action based on the provided feedback. Stated differently, feedback is only actionable if it provides an opportunity for improvement to those receiving the feedback.
Feedback can originate from both malleable skills (e.g. student engagement, online learning behaviour, or learning and studying skills) and non-malleable characteristics (e.g. grades or former academic achievement). In the former case, actionable feedback is achieved easily. In the latter case, the interventions should provide feedback on malleable skills related to student success that underlie the non-malleable characteristics.

Principle 2: Good return-on-investment

The different interventions within the STELA project aimed at a good return-on-investment, as this supports the actual application of learning analytics in higher education. The interventions within the project took different approaches:

• Interventions designed, developed, and deployed within the STELA project: the LASSI dashboard (Section 3.1), the REX dashboard (Section 4.1), the POS dashboard (Section 3.2 and Section 4.2), and the SPOC teacher and course builder dashboard (Section 2.2);

• Existing interventions applied to the new context of transition from secondary to higher education: the Learning Tracker (Section 2.1); and

• Up-and-running interventions in the context of transition from secondary to higher education: the NTU Student Dashboard (Chapter 5).

Obviously, these different approaches impact the kind of investments made and the return that can be expected. At the meta-level, the interventions within the project aimed at obtaining a good return-on-investment by means of scalability:

• by targeting particular online courses, reaching a lot of users, with tailored solutions (Chapter 2), and

• by targeting entire programs at higher education institutes with rather generic but scalable solutions, such that they can easily be transferred to other programs or higher education institutes (Chapter 3, Chapter 4, Chapter 5).

For further reading on return-on-investment regarding the project’s case studies, we refer to [5].

Principle 3: Social comparison theory

Social comparison is at the heart of the STELA learning analytics interventions. What all interventions have in common is that they use social comparison that allows students to compare themselves within a peer group that is “maximally related”: e.g. rather than being compared to all other first-year students, a student is compared to the other first-year students in the same program.

Over the different interventions, however, different choices regarding the peer group for comparison have been made:

• all students in the same program for the LASSI dashboard (Section 3.1), the REX dashboard (Section 4.1), the POS dashboard (Section 3.2 and Section 4.2), and the NTU Student Dashboard (Chapter 5);

• past successful learners for the Learning Tracker (Section 2.1).

Some advantages and disadvantages:

• all students: On the positive side, students can directly compare themselves to their current peers, e.g. the peers taking the same classes, hereby reinforcing the identifiability with the peer group. On the negative side, social comparison theory has mainly shown positive effects for students who are compared to a slightly better peer group, which will not be the case for every student when using the entire peer group. Moreover, when the peer group contains a lot of students who should not be taken as an example (e.g. a lot of first-year students will still drop out), the comparison to the entire peer group might put forward a standard that is too negative. Within the dashboards, the latter concern was alleviated by showing the relationship between past students’ results and student success.

• past successful learners: On the positive side, this allows learners to compare themselves to a group of successful learners, presenting them with a good example. Moreover, earlier research on social comparison theory has shown that social comparison has an especially positive effect when comparing to a better peer group. On the negative side, the reference group is a little more distant, as it concerns learners from previous runs.

The interventions show that it is not a trivial exercise to select the most appropriate reference group for social comparison. Within the interventions, the selection of the peer group and the assessment of the implications of this choice were supported by both researchers in first-year student success and student advisers.

Addition to Rec. 1: Use all available expertise

Student advisers and student success researchers provide the required expertise when choosing a reference group for social comparison. Moreover, they help to assess the implications of the choices made.

6.2 Sustainability of learning analytics interventions

Within the learning analytics interventions of the project, different approaches were taken, which also affect the efforts needed to realize a sustainable long-term implementation of the learning analytics interventions.

• The Learning Tracker originates from research. Therefore, its implementation focuses on particular MOOCs that provide sufficient support for research regarding the Learning Tracker. This step is necessary to provide the required support for further implementation. For further continuation, the Learning Tracker should become part of the institute’s strategic plan regarding MOOCs and be integrated within the IT support for MOOC development.

• The LASSI, REX, and POS dashboards were developed within the framework of this project. They originate from actual needs within the programs and support services of the universities, but are also combined with a research perspective. This is considered to be a bottom-up initiative, strongly supported by the students and student advisers. The dashboards were developed to be well integrated with the university’s existing IT solutions. For long-term continuation, the dashboards should now be fully integrated in the overall IT solutions of the university, and continuous effort is required to further embed the dashboards within the university’s educational practices. Currently, KU Leuven has already integrated learning dashboards within its educational vision and policy plans. Further integration with IT services is under negotiation.


Chapter 6. Conclusion

• The NTU Student Dashboard is part of the institutional policy and embedded within actual educational practices and IT-solutions. It is considered a top-down initiative, with strong support from tutors and students. The NTU Student Dashboard is already embedded within the IT-solutions of the university. Continuous effort is required to ensure the good working of the dashboard and its full integration with the university's educational practices.

Rec. 4: Long-term deployments require embedding

The experiences from the project show that the developed learning analytics interventions will only be sustainable if they are embedded within the existing institutional IT-choices, know-how, and educational policy and practices.

• The experience with the Learning Tracker has shown that research teams and educational developers can build the required evidence for learning analytics interventions, supporting further continuation and long-term deployment of the initiatives.

• The experiences of the LASSI, REX, and POS dashboards have shown that bottom-up initiatives, developed within projects, can build the required experience and can already realize support from the stakeholders as well as maximal integration with the existing institutional IT-choices and know-how. However, for long-term continuation, the involved institutes should fully embed the developed learning analytics dashboards within their IT-support and overall educational vision and practices.

• The experience of the NTU Student Dashboard has shown that a top-down institutional policy around learning analytics can be successful, provided that the different stakeholders are integrated within the overall project and that the initiatives are well embedded within the institutional vision, practices, and IT-solutions. The particular experience of NTU shows that cooperation with an external technology provider can be beneficial, provided that sufficient effort and care is taken regarding integration within the institution's systems and practices, and that continuous support is provided to guarantee good working over time.

6.3 Recommendations

This section lists, as a summary, the recommendations resulting from the learning analytics interventions to support the transition from secondary to higher education.


Rec. 1: Use all available expertise

Teachers and course builders will provide the contextual information that is often needed to make sense of the data provided in online courses. They can add information on the targeted audience, the timing of the course, the setup of the course (e.g. blended learning), etc.
Student advisers provide invaluable support for designing learning analytics interventions. Their expertise covers the advising process and the typical challenges, experiences, needs, and lines of thought of the involved students. Their input was particularly valuable in supporting the ethics of the learning analytics dashboards: which data to display, which messages to deliver, how to refer to actions to take and additional support, etc. Furthermore, they are the experts in the particular context of the students, which allows them to make small adaptations to the dashboards to tailor them as much as possible to the context of the program at hand.

Student advisers and student success researchers will provide the required expertise when choosing a reference group for social comparison. Moreover, they will help to assess the implications of the choices made.

Students are the end-users of the education provided by the higher education institute and should therefore not be overlooked. Students can help to assess the implications of the choices made and can contribute their actual experiences in higher education. Furthermore, they will help to guide the ethical implementation of the learning analytics interventions.

Rec. 3: Feedback should be actionable

Data-based feedback should be actionable, i.e. it should allow the user (student, prospective student, teacher, etc.) to take action based on the provided feedback. Stated differently, feedback is only actionable if it provides an opportunity for improvement to those receiving the feedback.
Feedback can originate from both malleable skills (e.g. student engagement, online learning behaviour, or learning and studying skills) and non-malleable characteristics (e.g. grades or former academic achievement). In the former case, actionable feedback is achieved easily. In the latter case, the interventions should provide feedback on the malleable skills related to student success that underlie the non-malleable characteristics.


Rec. 2: Acceptance precedes impact

Before impact can be realized with learning analytics dashboards, the dashboards have to find their way to practice. Therefore, these dashboards first have to be accepted by the stakeholders. The involvement of the student advisers in the design of the learning dashboards, and in particular the possibility to edit the content of the dashboards directly, has created the required acceptance.

Rec. 4: Long-term deployments require embedding

The experiences from the project show that the developed learning analytics interventions will only be sustainable if they are embedded within the existing institutional IT-choices, know-how, and educational policy and practices.

• The experience with the Learning Tracker has shown that research teams and educational developers can build the required evidence for learning analytics interventions, supporting further continuation and long-term deployment of the initiatives.

• The experiences of the LASSI, REX, and POS dashboards have shown that bottom-up initiatives, developed within projects, can build the required experience and can already realize support from the stakeholders as well as maximal integration with the existing institutional IT-choices and know-how. However, for long-term continuation, the involved institutes should fully embed the developed learning analytics dashboards within their IT-support and overall educational vision and practices.

• The experience of the NTU Student Dashboard has shown that a top-down institutional policy around learning analytics can be successful, provided that the different stakeholders are integrated within the overall project and that the initiatives are well embedded within the institutional vision, practices, and IT-solutions. The particular experience of NTU shows that cooperation with an external technology provider can be beneficial, provided that sufficient effort and care is taken regarding integration within the institution's systems and practices, and that continuous support is provided to guarantee good working over time.


Authors

Jan-Paul van Staalduinen, J.P.vanStaalduinen@tudelft.nl, TU Delft, Netherlands

Tinne De Laet, tinne.delaet@kuleuven.be, KU Leuven, Belgium

Tom Broos, tom.broos@kuleuven.be, KU Leuven, Belgium

Philipp Leitner, philipp.leitner@tugraz.at, TU Graz, Austria

Martin Ebner, martin.ebner@tugraz.at, TU Graz, Austria

Rebecca Siddle, rebecca.siddle@ntu.ac.uk, Nottingham Trent University, United Kingdom

Ed Foster, ed.foster@ntu.ac.uk, Nottingham Trent University, United Kingdom
