Metadata for Assessment resources


Aspect Metadata Plugfest, 18 January 2011



Muriel Foulonneau, Tudor Research Centre

Luxembourg

muriel.foulonneau@tudor.lu


Who we are

Open source assessment platform (TAO)
Uses semantic technologies

Online/offline assessment services:
OECD PISA (Programme for International Student Assessment) and PIAAC (adult competencies) studies, in 40+ countries
School monitoring (Luxembourg, Hungary)
Assessment of students' awareness of health issues
Assessment of the efficiency of documents in increasing candidates' skills, and of the instructional efficiency of specific trainings
Competence assessment for unemployed people
Adaptive testing for language diagnostics in a language school

https://www.tao.lu


Who we are (2)

Research projects: new types of items (Cogsim), interactive table, item/test quality issues (TAO-QUAL), formative assessment, attention data

Developments: medical assessment; formative assessment (peer / self-assessment)

International cooperation: ETS (TOEFL, US), Leipzig Institute for Science Education (GE), NIER (Japan), …


Item development and management

Manage information about items (classification, …)

Multilingual and cognitive items

WYSIWYG authoring

Different item types/templates (MCQ, Kohs, C-Test, Campus, Cascade, QTI, XHTML, HAWAI, …)


Multiple models

Items mainly carry structural information and few metadata elements; metadata models are left to the test authors

E.g., for cognitive and socio-economic correlations

Test authors can have models of competences to relate items to (a sketch follows below)

They do not create many metadata elements, and they all have their own model (a PhD student is working on model elicitation)
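To illustrate the kind of author-specific model this implies, the sketch below links an item to a competence in RDF with rdflib. The namespace, class and property names (AssessmentItem, assessesCompetence, ReadingComprehension) are hypothetical illustrations, not part of TAO or of any standard; the point is only that each author's model can be expressed as a small RDFS vocabulary.

    # Minimal sketch (hypothetical vocabulary): an author-defined competence model in RDF.
    from rdflib import Graph, Namespace, Literal
    from rdflib.namespace import RDF, RDFS

    EX = Namespace("http://example.org/author-model/")   # hypothetical author namespace

    g = Graph()
    g.bind("ex", EX)

    # A small author-specific vocabulary: a competence class and a linking property.
    g.add((EX.Competence, RDF.type, RDFS.Class))
    g.add((EX.assessesCompetence, RDF.type, RDF.Property))
    g.add((EX.assessesCompetence, RDFS.range, EX.Competence))

    # One competence and one item related to it.
    g.add((EX.ReadingComprehension, RDF.type, EX.Competence))
    g.add((EX.ReadingComprehension, RDFS.label, Literal("Reading comprehension", lang="en")))
    g.add((EX.item_0042, RDF.type, EX.AssessmentItem))
    g.add((EX.item_0042, EX.assessesCompetence, EX.ReadingComprehension))

    print(g.serialize(format="turtle"))

Because each author defines such a vocabulary independently, mapping and elicitation (discussed later in the deck) become necessary to compare or aggregate across models.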


Item: item development and management
Test: test development and management
Test takers: test taker management
Group: group management
Results: results management

Improving the management of resources

Item storage for longitudinal studies and for item model reuse
Access rights and security issues
Identification of items and tests (a sketch of such a record follows below)
Item components, including metadata and multimedia resources (assets), stored in the item bank or in external multimedia repositories

Implementation of models: standard models, ontology elicitation
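To make the identification and asset questions concrete, here is a minimal sketch of what an item bank record could hold: a stable item URI, a version, coarse access rights, and references to assets that may live in an external multimedia repository. All field names and values are hypothetical illustrations, not TAO's actual data model.

    # Minimal sketch (hypothetical field names): an item bank record with a stable
    # identifier and references to assets held in an external repository.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ItemRecord:
        item_uri: str                      # stable, globally unique item identifier
        version: int                       # supports longitudinal studies / item model reuse
        allowed_roles: List[str]           # coarse access rights for the item
        asset_uris: List[str] = field(default_factory=list)  # assets in external repositories

    record = ItemRecord(
        item_uri="http://example.org/items/0042",
        version=3,
        allowed_roles=["item-author", "test-developer"],
        asset_uris=["http://assets.example.org/images/market-scene.jpg"],
    )
    print(record.item_uri, len(record.asset_uris))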


Standards for assessment resources

For candidates: IMS Learner Information Package?

For populations: ?

For classifications (e.g., of skills and competences)

Standard for tests and items: IMS QTI (a minimal sketch follows below)
Many local formats (e.g., HotPotatoes, Moodle, Blackboard)
Implementation is very partial in most platforms (CETIS survey 2010, ICOPER study D6.1)
Exchanging items and tests across platforms
Managing items in item repositories
No autonomous representation of multimedia resources
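For readers unfamiliar with QTI, the sketch below shows roughly what a minimal QTI 2.1 choice item looks like; the identifiers, prompt and choices are invented for illustration, and real items usually carry much more (templates, response processing, feedback, styling).

    # Rough sketch of a minimal IMS QTI 2.1 choice item (identifiers and text are invented).
    import xml.etree.ElementTree as ET

    QTI_ITEM = """
    <assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
                    identifier="item-0042" title="Example choice item"
                    adaptive="false" timeDependent="false">
      <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
        <correctResponse><value>choiceA</value></correctResponse>
      </responseDeclaration>
      <itemBody>
        <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
          <prompt>Which of the following is a metadata standard?</prompt>
          <simpleChoice identifier="choiceA">IEEE LOM</simpleChoice>
          <simpleChoice identifier="choiceB">JPEG</simpleChoice>
        </choiceInteraction>
      </itemBody>
    </assessmentItem>
    """

    # Parsing the item is enough to check it is well-formed XML before exchanging it.
    root = ET.fromstring(QTI_ITEM.strip())
    print(root.attrib["identifier"], root.attrib["title"])

Exchanging items across platforms then amounts to agreeing on this XML structure; the partial implementations mentioned above typically support only a subset of the interaction types.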


Descriptive metadata in IMS-QTI

LOM profile +
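The slide's "LOM profile +" refers to QTI descriptive metadata being based on an IEEE LOM profile plus QTI-specific additions (fields such as interactionType or toolName in QTI 2.x). As a rough, non-normative illustration, the sketch below shows the kind of record this yields as a plain Python structure; the exact field selection and values are examples only, not a binding.

    # Illustrative sketch of LOM-style descriptive metadata for a QTI item,
    # plus a few QTI-specific fields; field choice and values are examples only.
    item_metadata = {
        "general": {
            "identifier": "http://example.org/items/0042",
            "title": "Example choice item",
            "language": ["en", "fr"],
            "keyword": ["reading comprehension", "health literacy"],
        },
        "educational": {
            "learningResourceType": "assessment item",
            "difficulty": "medium",
            "typicalAgeRange": "15-16",
        },
        "qtiMetadata": {                      # QTI-specific extension of the LOM profile
            "interactionType": "choiceInteraction",
            "timeDependent": False,
            "solutionAvailable": True,
            "toolName": "TAO",
        },
    }

    print(item_metadata["qtiMetadata"]["interactionType"])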


Usage data

Test development and management

Selection of items

Opportunity to sequence items by difficulty, weight, guessing…

IMS-QTI offers a dictionary of usage data
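The difficulty, weight (discrimination) and guessing mentioned above correspond to the three parameters of the standard 3-parameter logistic (3PL) IRT model; wherever the values come from (QTI usage data or another source), the formula is the same. A minimal sketch, with invented item parameters:

    # Standard 3-parameter logistic (3PL) IRT model: probability that a test taker
    # of ability theta answers an item correctly, given the item's discrimination
    # (a, the "weight"), difficulty (b) and guessing (c) parameters.
    import math

    def p_correct(theta: float, a: float, b: float, c: float) -> float:
        return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

    # Items can then be sequenced by difficulty, or filtered by expected success rate.
    items = [("easy", 1.0, -1.0, 0.20), ("medium", 1.2, 0.0, 0.20), ("hard", 0.8, 1.5, 0.25)]
    for name, a, b, c in sorted(items, key=lambda it: it[2]):
        print(name, round(p_correct(0.0, a, b, c), 2))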


Multimedia resources

Item authoring includes multimedia resources: how to select resources?

Barker (2008) on learning material
Non-education metadata models (e.g., DC, MODS, MPEG-7), but what is really useful? (a Dublin Core sketch follows below)

=> looking into the expectations of test authors
=> looking into the risks for items, primarily the risks of bias

Cognitive and cultural aspects require information on both the subjects and the multimedia resources

Different roles of multimedia resources in an item (e.g., content visuals vs. context visuals), with different impacts
=> is it possible to capture user-generated data / paradata to get additional information?
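As a concrete example of the non-education models mentioned above, the sketch below describes a multimedia asset with Dublin Core elements and adds one custom property for the content-vs-context role. The asset namespace and the visualRole property are hypothetical extensions, not DC terms.

    # Sketch: Dublin Core description of a multimedia asset, plus a hypothetical
    # property capturing its role in the item (content visual vs. context visual).
    from rdflib import Graph, Namespace, Literal
    from rdflib.namespace import DC

    EX = Namespace("http://example.org/assets/")          # hypothetical asset namespace
    g = Graph()
    g.bind("dc", DC)
    g.bind("ex", EX)

    asset = EX["market-scene.jpg"]
    g.add((asset, DC.title, Literal("Street market scene")))
    g.add((asset, DC.format, Literal("image/jpeg")))
    g.add((asset, DC.rights, Literal("CC BY 4.0")))
    g.add((asset, DC.coverage, Literal("Western Europe")))   # cue for cultural-bias review
    g.add((asset, EX.visualRole, Literal("context")))        # hypothetical: content vs. context visual

    print(g.serialize(format="turtle"))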


http://llt.msu.edu/vol5num1/alseghayer/default.html

Need to

Provide standard metadata sets
Translation into RDFS
Mapping between metadata models provided by test authors (preferably RDFS models); see the sketch below
Access harmonized data on multimedia resources for the test authoring interface
Collect attention data
Have models for aggregating relevant usage data from assessment items used in formative assessment
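A minimal sketch of what such a mapping can look like in RDFS: an author-defined property is declared a sub-property of a standard one, so that harmonized queries over dc:subject also reach the author's data. The author namespace and property name are hypothetical.

    # Sketch: mapping an author-defined property onto a standard one with RDFS,
    # so that harmonized queries over dc:subject also cover the author's model.
    from rdflib import Graph, Namespace, Literal
    from rdflib.namespace import DC, RDFS

    AUTHOR = Namespace("http://example.org/author-model/")   # hypothetical author vocabulary
    EX = Namespace("http://example.org/items/")

    g = Graph()
    g.add((AUTHOR.topic, RDFS.subPropertyOf, DC.subject))     # the mapping itself
    g.add((EX["0042"], AUTHOR.topic, Literal("health literacy")))

    # With RDFS inference (e.g., via the owlrl package), the dc:subject triple would be
    # entailed; without inference, the mapping can drive query rewriting instead.
    print(g.serialize(format="turtle"))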


More information

http://www.tao.lu

muriel.foulonneau@tudor.lu

Thank you