
This article was downloaded by: [University of Chicago Library]
On: 06 August 2013, At: 04:22
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Assessment & Evaluation in Higher Education
Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/caeh20

Assessing heterogeneous student bodies using a methodology that encourages the acquisition of skills valued by employers
Alicia Perdigones a, José Luis García a, Vanesa Valiño a & Cecilia Raposo a
a Department of Rural Engineering, Polytechnic University of Madrid, Madrid, Spain
Published online: 06 Jun 2009.

To cite this article: Alicia Perdigones, José Luis García, Vanesa Valiño & Cecilia Raposo (2009) Assessing heterogeneous student bodies using a methodology that encourages the acquisition of skills valued by employers, Assessment & Evaluation in Higher Education, 34:4, 389-400, DOI: 10.1080/02602930802071056

To link to this article: http://dx.doi.org/10.1080/02602930802071056

PLEASE SCROLL DOWN FOR ARTICLE

Taylor & Francis makes every effort to ensure the accuracy of all the information (the “Content”) contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions


Assessment & Evaluation in Higher Education, Vol. 34, No. 4, August 2009, 389–400

ISSN 0260-2938 print/ISSN 1469-297X online
© 2009 Taylor & Francis
DOI: 10.1080/02602930802071056
http://www.informaworld.com

Assessing heterogeneous student bodies using a methodology that encourages the acquisition of skills valued by employers

Alicia Perdigones*, José Luis García, Vanesa Valiño and Cecilia Raposo

Department of Rural Engineering, Polytechnic University of Madrid, Madrid, Spain

(Received 1 August 2007; final version received 5 March 2008)

This work compares the results of three assessment systems used in two Spanish universities (the Universidad Politécnica de Madrid and the Universidad Católica de Ávila): the traditional system based on final examinations, continuous assessment with periodic tests, and a proposed system (specially designed for heterogeneous student bodies) orientated towards motivating students. This third system involved dividing the syllabus into two different parts: a common core assessed by multiple choice tests, and a specialisation assessed by a literature review, the writing of an article and an oral presentation. The latter skills are highly valued by employers. The proposed system led to a greater pass rate than that achieved by students taking similar courses assessed by the more conventional systems. In addition, the results show that involving students in the assessment process increases their participation in their studies and generates a feeling of satisfaction and justice.

Keywords: active learning; assessment system; educational innovation; motivation; skills valued by employers

Introduction

Student abandonment of engineering courses is currently a serious problem faced by most universities. According to a report by the Spanish Ministry of Science and Education (MEC 2006), up to 70% of students abandon university engineering diploma programmes, and only 4% manage to complete their studies in the time officially allotted.

Although this abandonment is due in large part to the difficulty of the multidisciplinary subjects that make up such courses, other factors, such as the teaching methods used, low student motivation and the assessment systems employed, contribute towards the problem.

The major parties involved in education need to make an effort to potentiate student learning. Teachers need to modify and modernise their activities both in terms of the teaching and assessment methodologies used, and students need to maintain an adequate level of motivation. They can be helped in this by modifying their attitudes towards the material taught (Juanes et al. 2006; Vallim, Farines, and Cury 2006). Many studies have been published on how to increase student motivation (Keeling, Jones, and Botterill 1998). Vallim, Farines, and Cury (2006) propose offering a preparatory course in which students feel themselves to be engineers from the first day of class. Via a series of activities students are allowed to verify society's need for the different subjects they are to learn, and to understand some of the problems they may encounter in their professional lives. Forte and

*Corresponding author. Email: [email protected]


Guzdial (2005) introduced a preparatory course for their students of information science; the students' level of success subsequently increased.

In addition to acquiring information, students also need to develop skills. The importance of this cannot be denied, and the need to incorporate course graduates into engineering companies has led to changes in classroom teaching methodologies. Currently, methodologies are followed that potentiate certain skills considered important by these companies, such as creativity (Baillie 2002), communication (Akister, Bannon, and Mullender-Lock 2000; Peat, Taylor, and Franklin 2005), group work (Roach and Gunn 2002) and the solving of real problems (Dunne and Rawlins 2000; Spurgin 2004; Mi, Shen, and Ceccarelli 2005; Plaza et al. 2005; Cervilla and Zurita 2006; Gomis et al. 2006). In an attempt to adapt to current trends it is now common to include ideas such as the design and development of engineering projects (Sklyarov and Skliarova 2005), introductory courses (Pantic, Zwitserloot, and Jan Grootjans 2005; Vallim, Farines, and Cury 2006), e-learning systems (Liu and Yang 2005), virtual laboratories (Gomis et al. 2006) and the use of computers and software in teaching methodologies (van Schaik, Pearson, and Barker 2002; McGuire 2005). These approaches increase comprehension of the material taught and thus improve learning (Plaza et al. 2005; Sklyarov and Skliarova 2005; Smaill 2005).

Assessment is a key element in all teaching–learning processes. It reveals the knowledge acquired by students and tracks their performance during the course. Assessment implies a certain collaboration between student and teacher and requires feedback for the correction of problems (Davies 2002; Yorke 2003; Perdigones et al. 2006a). The assessment criteria established should be clear and public so that students can understand the evaluation system to which they are subject. In this way they can measure their progress with respect to fixed goals.

A very common problem that occurs during this process lies in the perception students have of traditional assessment methods, which are often considered unfair or to have little to do with the material taught. Students often feel assessment to be the final goal of the learning process – to its clear detriment (Savin-Baden 2004). In addition, a sensation of failure is generated that can even affect students' personal lives. Ling, Heffernan, and Muncer (2003) performed a study in which students analysed the reasons for failure in their studies; illness, inadequate dedication to study and fear of exams were some of the problems highlighted.

Currently, numerous assessment systems are used with the aim of motivating students (Smaill 2005; Juanes et al. 2006; Perdigones et al. 2006b; Trotter 2006). The versatility and possibilities offered by multiple choice examinations have led to their careful study from a pedagogical point of view (Fawkes et al. 2005). Other authors have analysed how long they should be (Burton 2006), the time needed to complete them, their level of difficulty and their coverage of important concepts (Hwang, Yin, and Yeh 2006).

In any event, all assessment systems need to be adapted to the needs of the classroom and the education system being followed (Pratt 1997).

The Spanish university system: current problems

The Spanish university education system is composed of three blocks between which the contents of study plans leading to official qualifications are distributed. These three blocks correspond to core courses (the minimum contents of which are established by the government and are common throughout the country); this guarantees that the same basic education is received by all students working towards the same qualification at all


universities, non-core courses (including obligatory and optional subjects for final qualifications; these are determined by each faculty) and elective courses (proposed by each university for students' free configuration of their studies).

The courses in the last of these blocks are associated with a number of pedagogical problems derived from the educational diversity of the students that take them (i.e. some may be studying engineering while others are studying law). Students often abandon these courses (and therefore fail them), and classrooms are often largely empty.

In large universities, the classes in which these courses are taught are homogeneous in terms of the training and knowledge of students. Also, there are many elective courses available. In contrast, in smaller universities, the number of teachers is reduced, and therefore the number of elective courses on offer is smaller. In addition, these courses have a limited number of places available. This combination of factors often means that students will have to take courses for which their main area of study leaves them ill equipped. Elective courses are therefore a problem for smaller universities. The students that take them come from different areas of knowledge and will necessarily have different concerns about their content, and different needs and goals related to the qualifications they seek.

Aims

The main aim of the present study was to analyse an alternative methodology for assessing students when taking courses attended by a heterogeneous student body. This methodology has the goals of increasing the motivation of students, their participation in their studies, and therefore, the ultimate success of the course. To adapt to the assessment demands of a heterogeneous student body, the syllabus was divided into a common core (assessed by multiple choice tests), and a specialised area designed for each type of student (assessed by a literature review, the writing of an article and an oral presentation).

Currently, teachers of university courses are assessed by the number of publications they have in international journals and the importance of the journals selected (Jin, Rousseau, and Sun 2005). The proposed assessment system imitates this by requiring students to send the article they write to a technical magazine. This teaches them to structure their reports (a skill highly valued by employers) and provides the additional motivation of being able to improve their curricula vitae during their studies. Additional aims of this study were to compare the results obtained by students on this course with those on other courses (at both universities) employing different assessment systems, and to determine the factors involved in course success.

Methods

This study involved four elective subjects taught in the academic years 2004/05 and 2005/06 at the Universidad Politécnica de Madrid (UPM), and two elective subjects taught during 2005/06 at the Universidad Católica de Ávila (UCAV). These universities have very different characteristics and face different problems. The total number of students involved was 272.

Universities and subjects studied

Universidad Politécnica de Madrid

The UPM is a public university in Madrid, the capital of Spain. It has a strong tradition in engineering and enjoys prestige in the area. In 2004/05 it had some 36,585 students


studying the 20 degrees and diplomas offered. Among these is agricultural engineering, which has been taught at this university for 150 years.

The subjects analysed at the UPM were the elective subjects taught by the Electronics Unit of the Escuela Técnica Superior de Ingenieros Agrónomos: Automatic Control of Installations, Electrical Installations in the Food Industry, Low Voltage Electrical Installations, and Electronics Applied to Agriculture. Table 1 shows the characteristics of these courses.

Universidad Católica Santa Teresa de Jesús de Ávila

This is a private university in the Province of Ávila (Spain) which was founded in 1999. In 2004/05 it had 696 students studying the 11 degrees/diplomas it offers, as well as combinations of these leading to joint qualifications. It is therefore a small university with a small number of teachers (78 in total, including full time, part time and associate teachers). Its small size allows it to more easily use alternative assessment systems.

At this university, the study involved two subjects (Table 1): Automatic Control of Installations, and Electrical, Electronic and Telecommunications Installations in Buildings.

Assessment systems

The traditional assessment system

The assessment system used for the courses Automatic Control of Installations (UPM) and Electronics Applied to Agriculture (UPM) is based on a final theoretical/practical examination that provides 100% of the course grade. This is the system that has traditionally been used in this university. Feedback in this system is nil; students are simply informed of their final grades. There is no intermediate monitoring, and therefore neither teachers nor students have any knowledge of possible problems.

Continuous assessment system

This system was used for the courses Electrical Installations in Agriculture and the Food Industry (UPM), Low Voltage Electrical Installations (UPM), and Automatic Control of Installations (UCAV). The details of the system differed slightly between the two universities. At the UCAV the final grade was based on a final exam with theoretical and practical questions, plus continuous assessment. For the latter, a small, objective multiple choice test (10 questions) on theory was given at the end of each class, covering the material taught that day and on previous days. A total of eight of these tests was given. The mean of the scores obtained was converted into a coefficient (maximum 1.2); the score on the final examination was then multiplied by this coefficient to provide the final grade. Each week the students were told their grades and the errors they had made; corrections were also offered.

Continuous assessment at the UPM was similar. Monthly multiple choice tests were given, but the students also performed assessed practical work and undertook a project. The mean of these scores was converted to a coefficient (maximum 1.2). The score on the final examination was then multiplied by this coefficient to provide the final grade.
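The continuous assessment calculation described above can be sketched as follows. The text only states that the mean test score is converted into a coefficient with a maximum of 1.2, so the linear mapping used here (and the function name) is an assumption for illustration, not the authors' exact formula.

```python
def final_grade(exam_score, test_scores, max_coeff=1.2):
    """Final grade under the continuous assessment scheme (illustrative).

    exam_score  -- final examination mark on a 0-10 scale
    test_scores -- marks (0-10) from the periodic multiple choice tests
    max_coeff   -- maximum multiplying coefficient (1.2 in both universities)
    """
    mean_score = sum(test_scores) / len(test_scores)
    # ASSUMPTION: the text does not specify how the mean becomes a
    # coefficient; a simple linear mapping is used here for illustration.
    coeff = (mean_score / 10.0) * max_coeff
    return min(exam_score * coeff, 10.0)  # grades are capped at 10

# A student averaging 10/10 on the eight UCAV tests earns the full 1.2
# coefficient, so a 7.0 on the final examination becomes 7.0 * 1.2 = 8.4.
print(final_grade(7.0, [10.0] * 8))
```

Under this reading, the coefficient can also fall below 1, so weak test performance would reduce the examination mark rather than merely failing to boost it; the paper does not say which behaviour was intended.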


Table 1. Courses involved in the present study, and their basic characteristics.

Course | University | Academic year | No. of students | Type of course | Credits | Hours | Type of assessment
Automatic control of installations | UPM | 2004/05, 2005/06 | 25, 18 | Optional | 3 | 30 | Traditional
Electrical installations in agriculture and the food industry | UPM | 2004/05, 2005/06 | 59, 44 | Optional | 4.5 | 45 | Continuous assessment
Low voltage electrical installations | UPM | 2004/05, 2005/06 | 34, 33 | Optional | 6 | 60 | Continuous assessment
Electronics applied to agriculture | UPM | 2004/05, 2005/06 | 18, 15 | Elective | 4.5 | 45 | Traditional
Automatic control of installations | UCAV | 2005/06 | 18 | Elective | 3 | 30 | Continuous assessment
Electrical, electronic and telecommunications installations in buildings | UCAV | 2005/06 | 8 | Elective | 3 | 30 | Proposed system


Proposed assessment system

This new system, which involves the active participation of the students, was developed as a pilot experiment for the academic year 2005/06. It was used to assess the elective course Electrical, Electronic and Telecommunications Installations in Buildings taught at the UCAV. This course was attended by students studying for four different qualifications (Table 2).

The syllabus was divided into two parts: a common core, taught via lectures (in which all students received the same information), and a specialisation chosen by each student, directed using the technique of guided work. This personalised the course.

With respect to the core syllabus, lectures were given using new information and communication technologies (ICTs; employing PowerPoint presentations, overhead projectors and software for electrical calculations). A basic bibliography was supplied, as well as a list of large companies in the sector and their web addresses. At the end of each core syllabus class, multiple choice tests of 10 questions with two possible answers for each were given. These tests covered the material taught on the day and on previous days. A mean for all the test scores obtained was calculated; this provided a maximum of 4 points out of a total of 10.

With respect to the specific material studied by each student, the major qualification sought was taken into account. Electrical installations are to some extent present in nearly all walks of life. All domestic, industrial and agricultural installations require electricity. In addition, the rapid advances being made in electrical technology (first the discovery of semiconductors, followed by the development of analogue electronics and then digital electronics) have led to the appearance of more electronic and telecommunications installations. Nearly all households and tertiary sector buildings have electronic and telecommunications devices (domotic and inmotic devices, respectively), as do means of transport, roadways, computer systems, equipment in agricultural and silvicultural production lines, milking equipment, animal identification systems, etc. Other possibilities of this course include an orientation towards legal aspects, such as the environmental consequences of these installations. This type of specialisation might be useful to students coming from less technical or non-technical courses (e.g. those studying environmental sciences or law). No great problems were therefore found in personalising the course to student needs. Thus, a list of possible study choices was provided, organised by and orientated towards the main qualification being sought (Table 2). The students selected one topic and undertook a guided project. Assessment was divided into three blocks, which together gave the remaining 6 points of the final mark. These blocks, which were assessed independently, consisted

Table 2. Topics proposed by main qualification.

Qualification | Topics proposed
Agricultural engineering (five years); Diploma in agricultural engineering (three years) | Electrical installation for a park; Automating stock raising farms; Automating irrigation
Forestry engineering (five years); Diploma in forestry engineering (three years) | Automating a sawing plant; Biosensors
Information technology engineering (five years) | Voltage stabilising; Serial ports
Diploma in industrial engineering (three years) | Electronic starters; Automating lighting in installations; Domotics


of a search for information via a literature review, the writing of an article for a technical magazine and an oral presentation. A due date was set for turning in the literature review and the article in order to help students organise their time.

For the assessment of the literature review, the quality of the information collected was taken into account, as was the variety of sources and the degree of specialisation shown. The maximum score available was 1 point.

The article had to show a clear, formal and referenced synthesis of the information collected, with the aim of it being published in a technical magazine under the name of the student. This part of the assessment can therefore easily be made to involve anonymous, external agents whose assessment of the work will be fully objective. If the article is accepted the student receives the maximum score of 3 points. If rejected, the work is assessed by the teacher.

Finally, assessment culminates in each student's oral presentation of his/her work. A day was set for presentations to be made: the last day of the course. The use of different ICTs was encouraged. The time set for this presentation was 15 minutes.

All the students who were in class on that day took part in the assessment of one another. A questionnaire was given to each student (n = 8 including the teacher) with a guide explaining the aspects of the presentation they should assess: quality of the material (order, format, design of the presentation, work invested and overall impression) and the presentation (clarity, interest, communication with the public and general impression). Thus, the students evaluated their peers. The score was calculated as the arithmetic mean of all the aspects assessed by each student and the teacher. The maximum score possible was 2 points.

Each of the assessed parts of the course was therefore evaluated differently and by different agents (Table 3). This methodology and assessment system meant that students could decide whether or not they needed to take all the assessed components in order to achieve a passing grade (minimum 5 points; the sum of the points obtained in each

Table 3. Assessment of the proposed system's work blocks.

Assessment tool | Characteristics | Maximum score | Assessor | Form of assessment
Multiple choice test | Objective tests taken at the end of each class | 4 | Teacher | The mark obtained is the mean of all tests taken.
Literature review | A research and information gathering assignment | 1 | Teacher | Subjective assessment of the quantity and quality of the material presented.
Writing of an article | The writing of an informative article | 3 | Technical magazine | A maximum score is awarded if the article is accepted. If no response is received or the work is rejected, assessment is made by the teacher (or an outside expert).
Oral presentation | Presentation of work to the group; lasts a maximum of 15 minutes | 2 | Students and teacher | A standard guide was provided to the students so that they could assess the presenter using common criteria. The final mark is the arithmetic mean of the scores awarded.


assessment). Marks were given weekly so that students could follow their progress at all times.
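A minimal sketch of how the four assessment blocks combine into a final mark, following the point caps described above (4 for the tests, 1 for the literature review, 3 for the article, 2 for the oral presentation, with 5 points needed to pass). The function name and the example scores are illustrative, not taken from the study.

```python
def proposed_system_mark(test_scores, review, article, peer_scores):
    """Combine the proposed system's four assessment blocks (illustrative).

    test_scores -- per-class multiple choice marks, each already on a 0-4 scale
    review      -- literature review mark, 0-1 (awarded by the teacher)
    article     -- article mark, 0-3 (magazine, or teacher if rejected)
    peer_scores -- oral presentation marks, each 0-2, one per assessor
                   (the fellow students plus the teacher)
    """
    tests = sum(test_scores) / len(test_scores)         # mean of all tests
    presentation = sum(peer_scores) / len(peer_scores)  # arithmetic mean
    return tests + review + article + presentation      # out of 10

# Illustrative scores; 5 points or more is a pass.
mark = proposed_system_mark([3.2, 3.6, 2.8], 0.8, 3.0, [1.5, 1.8, 1.6])
print(round(mark, 2))  # 8.63
```

Because the blocks are independent, a student can see at any point which combinations of remaining components would still reach the 5-point pass threshold, which is what lets students decide whether to attempt every component.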

To evaluate the appropriateness of the proposed assessment system, the following were taken as indicators of quality: degree of student participation, attendance, the students' perceptions of their final grades and the percentage pass rate (i.e. the percentage of students who passed the course). The pass rate was also calculated for each of the remaining courses.

Results and conclusions

The elective course Electrical, Electronic and Telecommunications Installations in Buildings, offered by the UCAV and assessed using the proposed system, had a failure rate of just 12.5%; no students abandoned the course. The pass rate was therefore 87.5% (Figure 1).

The degree of participation (i.e. the number of students involved) in the different tasks was high (Figure 2). All took part (to a greater or lesser degree) in the assessment of the core topics (multiple choice) and the article writing. A problem detected was the long response time of the receiving journals, which might require redefining the date on which final grades are released. One of these replies came in time for assessment to be completed: the article of the students was published (García and Gómez 2006); for the rest of the students, scores were therefore awarded by the teacher. This, of course, does not preclude articles from eventually being accepted and published at a later date – which is equally useful on the curriculum vitae of the author. Daily class attendance in connection with the multiple choice tests was 62.2%. Some 25% of students attended 85% of the classes. Only 12.5% of the students attended less than half the classes. By way of comparison, the mean level of attendance on the course Automatic Control of Installations run by the same university was 56% (the only other course for which such data were available) (Figure 3). ANOVA showed this difference in attendance not to be significant. Thus, the course with

Figure 1. Percentage failure rate for four courses (two taught at the UPM and two at the UCAV) with different assessment systems.


the new assessment system enjoyed a high attendance rate, although one similar to that of the latter, continuously assessed course.

Interviews were held to determine the level of student acceptance of the proposed methodology and their perception of their final mark. The only student who failed the course complained. However, the characteristics of the system allowed the teacher to show this student that the literature review was the only component evaluated subjectively by the teacher, and that this was worth only 1 point of the final grade. The multiple choice tests were objective and the remaining components involved other assessing parties. In general, the students felt that their grades were fair; assessment was more objective and the students felt more directly responsible for the grades obtained.

Assessment with feedback via weekly tests not only motivated the students but also made them spread their effort out over the term. This is of use to the teacher since it

Figure 3. Percentage attendance of students on the UCAV courses Electrical, Electronic and Telecommunications Installations in Buildings (EETI) and Automatic Control of Installations (AC).

Figure 2. Student participation rate (number of students taking part) in the different tasks performed in the proposed assessment system, and mean final grades.


provides information about the academic level of the class and the degree of comprehension of the material taught; this feedback allows the course to be modified to meet the needs of the group. In addition, the students have daily reference to their grades; this not only constantly motivates them but also invests them with a feeling that their work has been justly assessed. The students showed particular interest in the idea of publishing an article; this part of the assessment was key in motivating them.

The proposed system also seemed to have a positive influence on the number of failures and abandonments, although other factors also influence these variables. When all the courses analysed were compared, the main factor influencing the pass rate was found to be the number of students on the course. Figure 4 shows the variation in the pass rate for the six elective courses taught (note that some were taught on two occasions), with their different assessment systems, with respect to the number of students taking them. The greater the number of students, the lower the pass rate.

The proposed system would seem to be very useful for assessing students taking elective courses when the student body is small and of heterogeneous background – and therefore has different needs and levels of interest in the course material. In addition, this system helps students develop skills valued highly by society and employers, such as autonomy, the ability to search for information, work analysis, the writing of a final document, critical assessment, presentation skills and oral communication.

If used more widely, the proposed system may be useful in curtailing the high number of students who currently abandon engineering courses.

Acknowledgements
This work was undertaken by the Grupo de Innovación Educativa en Tecnologías Eléctricas y Automática de la Ingeniería Rural (IE-TEA), at the Universidad Politécnica de Madrid (Spain).

Notes on contributors
Alicia Perdigones is a professor at the Department of Rural Engineering, Polytechnic University of Madrid. She is undertaking a project on assessment methodology.

Figure 4. Pass rate with respect to the number of students on the six courses analysed.


Assessment & Evaluation in Higher Education 399

José Luis García is a professor at the Department of Rural Engineering, Polytechnic University of Madrid. He is head of the teaching group of the Rural Engineering Department.

Vanesa Valiño is an assistant professor at the Department of Rural Engineering, Polytechnic University of Madrid. Her current research is concerned with assessment methodology.

Cecilia Raposo is a professor at the Department of Rural Engineering, Polytechnic University of Madrid. She has published papers on evaluation and teaching methodologies.



