
ORIGINAL PAPER

Mate, you should know this! Re-negotiating practice after a critical incident in the assessment of on-job learning

Karen Vaughan & Andrew Kear & Heath MacKenzie

Received: 16 January 2014 / Accepted: 5 June 2014
© Springer Science+Business Media Dordrecht 2014

Abstract This article examines a critical incident during research investigating a new assessment system for on-job learning in carpentry. The system was designed to establish clear relationships between supportive learning environments and purposeful, professional assessment of learners’ progress through “naturally occurring evidence” on the building site. However, an unfortunate set of circumstances produced an assessment “perfect storm” for a training advisor’s workplace visit. It provoked distress and disquiet for all the people involved: the apprentice, the employer, the training advisor/assessor, the moderators, and the researcher. The article frames the incident as a catalyst for critical reflection and shared learning. It argues that certain features in the system, including an on-job assessment community of practice based around “social moderation” of assessment judgements, helped create both the critical incident and the subsequent renegotiation of practice and realignment of relationships.

Keywords Community of practice · On-job learning · Assessment · Social moderation · Critical incident · Workplace learning

Introduction

This article examines a critical incident during research into an Industry Training Organisation’s new system for assessing on-job learning in carpentry in New Zealand. The incident involved assessment of a carpentry apprentice’s competence and did not go as planned, disturbing all those involved. It triggered a number of changes to relationships and practices, including ours as this article’s authors – respectively the researcher, the architect of the assessment system, and the assessor/training advisor.

Vocations and Learning
DOI 10.1007/s12186-014-9118-8

K. Vaughan (*)
New Zealand Council for Educational Research, Wellington, New Zealand
e-mail: [email protected]

A. Kear · H. MacKenzie
Building and Construction Industry Training Organisation, Wellington, New Zealand

The article argues that it was the design and nature of the social learning through the new assessment system itself, together with the role of the researcher, that helped create both the critical incident and the subsequent learning for individuals and the community of practice. Understanding of the initial incident was transformed by drawing on the “social moderation” (Gipps, 1994) arrangements of the assessment system and its emphasis on rich conversations, intersecting with participation in a research project based around professional assessment conversations.

Researching systems for the assessment of on-job learning

The critical incident occurred during research focused on a newly-developed system for the assessment of on-job learning in carpentry apprenticeships. The research was designed to investigate the ways in which the system operationalised good practice principles for assessment systems dealing with on-job learning. The good practice principles at the heart of this research had been developed through previous research into on-job assessment systems in use throughout the New Zealand industry training sector. That previous research had found that industry educators and managers often failed to understand assessment as an expertise distinct from that of teaching, and that they struggled to develop systems that could support robust assessment judgements and promote good learning processes, as well as outcomes (Vaughan & Cameron, 2009, 2010a). The previous research then produced a guide to good practice which set out four non-prescriptive, high-level principles with self-review questions and fictional exemplars to help Industry Training Organisations (ITOs) develop and maintain good assessment systems for their specific and unique industry contexts (Vaughan & Cameron, 2010b). The principles were:

• ITOs and employers need a clear and shared understanding of the purpose for assessment so that the right knowledge and skills are assessed

• The ITO’s assessment systems must support the learning process so that learners become active participants in assessment

• Good assessment systems require appropriately recruited, trained, and professionally developed people who understand that the skills of the trainer and those of the assessor are complementary, but different

• Moderation is especially effective in contributing to reliability and validity of assessment judgements, and to maintenance of competency standards, if it is a collective exercise

The good practice guide was well received by the industry training sector and subsequently generated demand for an assessment systems planning and design workshop led by the researchers. For many ITOs, the guide represented a new way of thinking. For a few others, the guide was a useful way to frame some ideas they already had, but had been unable to bring to fruition. Opportunely, the Building and Construction ITO (BCITO) had established a new assessment system and, while it did not necessarily arise directly as a result of the guide, it did appear to embody the high-level principles. It thus opened up the possibility of inquiring into how a specific industry might give life to the four principles and form a cohesive strategy. NZCER, the organisation which conducted the original research, and the BCITO formed a collaboration to do just that, attracting funding from Ako Aotearoa: the Centre for Tertiary Teaching Excellence.

The BCITO is responsible for leading skills standards and qualifications development and brokering training, including managing apprenticeships, in the building and construction industries. Its new assessment system in carpentry shifted responsibility for workplace-based assessment away from thousands of small business employers, who were also trainers, to fewer than one hundred ITO-employed, regional advisors who also had oversight of each employer-apprentice training arrangement. This shift in responsibility for assessment framed and made possible a new set of arrangements. The now small number of assessors, already ITO-employed, could be provided with professional development to undertake assessment as a professional activity, rather than a technical activity based on marking off answers from rigidly-designed worksheets. Furthermore, a team was formed around every apprenticeship. Each member of the “assessment team” had specific and complementary roles in supporting the apprentice’s learning:

• The apprentice was responsible for learning the trade, and gathering and presenting evidence of their learning and competence;

• The training advisor was responsible for assessing the apprentice’s competence, overseeing the apprenticeship, and providing guidance to the apprentice and employer;

• The employer was responsible for teaching the apprentice and evaluating the authenticity of the evidence presented by the apprentice; and

• The moderator was responsible for moderating consistency of judgement across multiple training advisors and assessment visits, and providing professional development support to the training advisor.

Research into the new system focused on examining instances of on-job assessment as expressions of the four good practice principles through the “production space” of assessment – that is, the formal assessments and informal assessment-related events, interactions, conversations, and reflections generated by on-job assessment that ultimately produce learning outcomes. We used a multi-stage, quota sampling technique to recruit five “assessment teams” (training advisor/assessor, evaluator/employer, apprentice, and moderator) who were most likely to provide us with good opportunities to observe good assessment practice and systems. Over a 5–7 month period, researchers twice visited workplaces to observe formal assessments and interview members of the assessment team (the moderator only attended one of the two assessment visits so was only interviewed once).

Researchers also surveyed all BCITO training advisors/assessors and moderators and observed the BCITO's National Moderation Workshop. In addition to interview and observation data, researchers collected ITO internal documentation, examples of training plans, and apprentice work records. Further details of the research process and findings are available in the research report (Vaughan, with Gardiner & Eyre, 2012). This article is confined to a discussion of the critical incident which occurred during the research and which provided everyone with an unexpected learning opportunity.

A critical and disorientating incident

It all began innocently enough. One of the researchers had arranged to observe a training advisor’s visit to an apprentice. These visits to workplaces (building sites) generally occurred on an as-needed basis but at least four times each year, and were a key way that training advisors (now also the assessors) managed each apprenticeship. Management included reviewing the apprentice’s progress and setting goals, fostering good relationships with employers and apprentices, and – as was the case with this visit – assessing the apprentice’s competence against the qualification standards.

The visits were typically organised around taking account of “naturally occurring evidence” arising from the “walkaround”, where apprentice and training advisor would tour the building site, talking about the apprentice’s work process, reasoning, and decision-making in relation to the apprentice’s tangible work product and portfolio records. The employer’s conversations with the training advisor in advance of, and subsequent to, the assessment visit, together with evidence from a series of other visits, provided verification of the evidence and additional perspectives about the apprentice’s commercial competence.

This particular assessment visit turned out to be extraordinary in several ways. Firstly, unexpected rain set in and forced a last-minute change away from the typical setting of the building site at which the apprentice was currently working to the more unusual (but not entirely unheard of) setting of an office. Secondly, there were a number of observers who would not typically be present. The researcher had arranged to observe this particular visit and had travelled for nearly two hours to get there. A moderator overseeing the training advisor’s work had also arranged to observe, and to bring along a second moderator whom he was coaching into the role. In addition, the apprentice’s employer used the rain interruption to his work as an opportunity to sit in on the assessment, taking the assessor and apprentice somewhat by surprise. While the assessment visit could have been postponed, the apprentice was eager for a chance to gain credits towards his qualification, and everyone had already arranged to be there for the afternoon. The combination of these quirks of fate meant that what normally would have been a fairly low-key assessment event was now marked by criss-crossing lines of scrutiny and anxiety.

Following introductions, everyone was seated around the room with the training advisor and apprentice together at the main table. The training advisor summarised the previous visit and anything that had happened since, and introduced the aims of the current visit: to assess the apprentice’s competence in understanding legislation and in using plans and specifications. The training advisor then initiated a conversation with the apprentice about his recent work and learning. Normally this conversation would provide some of the key evidence about an apprentice’s competence.

Unfortunately the apprentice floundered. Without any of the contextual cues normally found on the building site, he literally could not point to what he knew and could do. Although plans and specifications can be read anywhere (not only on site), they cannot readily be interpreted just anywhere. In front of a de-facto panel, and without any authentic work products, tools, or sensory prompts, the apprentice struggled to demonstrate real understanding of plans and specifications. He could describe his recent actions on site but not the principles and reasoning behind them. He also showed little understanding of legislation, a fairly theoretical aspect of carpentry knowledge, which now seemed further abstracted in the office-based setting.

The more the apprentice struggled, the more the training advisor stepped up his efforts to prise evidence of competence out of him. He asked ever more questions, angling each in multiple ways. However, the apprentice was now in a deep pit. It became clear that he could not be “signed off” and credited with the standards towards his qualification. The employer, who had initially tried chipping in with helpful prompts, had been reduced to dramatic sighing. He was clearly embarrassed by the performance of his apprentice. “Mate, you should know this! You’re useless!” he pronounced at one point. The two moderators and the researcher sat rigidly by. Was there a way to stop this agony? Was it anybody’s role to do so?

It seemed a great relief when the training advisor concluded the assessment and moved into setting future learning goals with the apprentice. When the formalities ended, the researcher began conducting some of the planned one-on-one interviews – firstly with the apprentice, followed by the employer, the training advisor, and the moderator. The interviews were designed to allow each person to reflect on the assessment process. As each person did so, they highlighted personal, professional, and system-wide disturbances of the assessment visit. For the researcher, similar disturbances emerged through the process of interviewing that made her wonder about her own role.

Negotiating new roles

The apprentice expressed humiliation and despondency throughout his interview with the researcher. He worried that his employer would lose confidence in him. Why, he implored the researcher, had his study technique of reading through his workbooks at night failed to translate into articulable knowledge? The assessment event had forced the apprentice to question the validity of his epistemic beliefs about learning being simple, quick, and certain (Harteis, Gruber & Hertramph, 2010).

In his interview, the employer described his apprentice’s performance as reflecting poorly on him as a trainer. He felt frustrated and embarrassed. He also felt unsettled about the training advisor having a combination of apprenticeship management and assessment responsibilities. “I’m the only one who knows what [the apprentice] can really do”, he complained. The employer was struggling with the loss of his old role as an assessor and understanding the value of his new role as an evaluator of evidence. Where did he now stand in relation to the training advisor, and how should they work together to support the apprentice’s learning?

The training advisor reflected on his own performance in the new role of assessor in his interview. He acknowledged that he had been “looking too hard” for evidence of the apprentice’s competence. He had been anxious to impress the researcher and the moderators. If it were not for that, he might have asked the moderators and employer to leave the room (but not the researcher). Regardless, he thought he could have steered the assessment to an earlier conclusion when it became clear that the apprentice could not meet the standard.

The moderator acknowledged in his interview that his normal approach of “trying to keep out of the way” had been very uncomfortable. His low-key presence usually worked in the “natural” setting of the building site. However, in this case it felt entirely unnatural to be sitting by and not assisting the training advisor – which he occasionally did, especially since the new moderator positions were taken up by the most capable of the training advisors. He was very aware that his presence in the room had contributed to the pressure felt by the training advisor, employer, and apprentice. How could he, in his moderator role of providing professional development to a regional group of training advisors, learn from this in order to support them as well as the training advisor involved in the incident?

The insider-outsider space

Throughout the interviews the researcher found herself confronted by her positioning in terms of what research debates refer to as insider and outsider status. Researchers who are members of a specified group are “insiders”. They have a familiarity that some argue gives them privileged access to particular kinds of knowledge (Merton, 1972, in Mercer, 2007) and better positions them for making meaning (Shah, 2004). On the other hand, non-member or outsider researchers may be better placed to avoid the “thin” data that comes from familiarity. Such familiarity can lead insider researchers to take things for granted or not ask the obvious questions, particularly if they are trying to avoid saying or doing things that might jeopardise their membership of the group (Mercer, 2007).

If we think about insider and outsider status as occupying opposite ends of a continuum, rather than being dichotomous classifications (Mercer, 2007), we can understand the researcher as having a mix of statuses in relation to the assessment visit. Her outsider characteristics framed her participation as a passive observer, knowing that she was causing some of the anxiety in the room with her note-taking and silent, observing presence – activities that positioned her as “other” to the “hands-on” and participative culture of construction work and learning. On the other hand, she was not an outsider in the sense of being a total stranger. She was already known to the ITO through her previous research; indeed, the organisation had specifically sought her to undertake the current project. She was trusted by the training advisors and moderators, whom she had met before. These things, together with her own personal interest and growing skills in carpentry work, and the organisation’s endorsement of her to the employers and apprentices, meant she had a degree of credibility and rapport with everyone there.

It was this mix of insider-outsider characteristics that afforded her the opportunity to observe an assessment and talk to the people involved. However, this mix was also precisely why she felt so disorientated. Her list of pre-prepared interview questions now seemed inappropriate. They were focused on getting people to think about the assessment process, and whether and how the system reflected the good practice principles from earlier research. Yet these principles had been little in evidence during the assessment she had just witnessed. The questions were now going to be awkward and insensitive in the face of people trying to process their feelings about what had gone wrong.

Rather than rigidly pursue the original interview questions, the researcher opened up a more unstructured space with each interviewee. As interviewees variously expressed their feelings of humiliation, dismay, and frustration – feelings she had in part provoked – she was pulled into a role that did not quite fit with her image of the dispassionate “outsider” researcher. The employer and apprentice clearly saw her as an “expert” and sought her advice, providing a good illustration of the axiom that “people’s willingness to talk to you, and what people say to you, is influenced by who they think you are” (Drever, 1995, p. 31, in Mercer, 2007, p. 7). While the researcher never intended to provide guidance in her role, she did opt to “sow seeds” during the interviews. She rationalised this as a way to invite more reflection on the part of the interviewees (useful data) and gently steer them towards resources or activities that might help them. Her questions now included: asking the apprentice if he thought he might find active study techniques more effective and, if so, whether he had a friend with whom he could discuss the workbook content; asking the employer what he would do to better support the apprentice’s learning and better collaborate with the training advisor; and asking the training advisor what he would have done differently and what resources he had at his disposal. The interviewees took the opportunity to reflect on these questions as the researcher acted as a “mirror” to their practices.

It had been the assessment “perfect storm”: an off-site setting, an ambitious but underprepared apprentice, an anxious training advisor, a concerned employer, and an unusually large group of spectators. It had left most of us asking uncomfortable questions about our roles and responsibilities, as well as the process itself. The good news is that, rather than sweeping the experience under the carpet, it became part of the assessment system itself. Four months later, the researcher visited the same assessment team and they described changes to their understandings and practices, with the new assessment system playing a key role in making this possible. The researcher, too, had shifted in her understanding of her role as part of a practice system.

The disorienting dilemma as catalyst for learning

Transitions, crises, disorienting dilemmas, and mistakes can often be catalysts for learning and change. They can create disjuncture and incoherence between our beliefs about how the world is and what our roles and purposes are. Under these circumstances, individuals and organisations find ways to make meaning from, and reconcile, the incongruities that are exposed or created. Mezirow’s theory of transformative learning locates the disorienting dilemma as the starting point of a three-phase process towards new perspectives and actions: critical reflection on existing perspectives to create new perspectives, discourse to validate critically reflective insights, and then action leading to learning (Segers & De Greef, 2011).

Reflection on personal performance has long been recognised as an important strategy to promote high-quality, deep learning and improve practice (Brockbank & McGill, 2007; Hinett, 2002). For example, “critical incident audits” are integral to medical practice. Medical staff analyse the significant events resulting in either beneficial or detrimental outcomes for patients in order to identify good or bad practice and learn reflectively. In schools, teachers aim to capitalise on “teachable moments” – moments which emerge through unplanned events, often with a significant emotional dimension for the student, which makes the student particularly receptive to meaningful learning.

While research shows that reflecting on practice and making meaning are extremely useful approaches to learning, it also shows that people are not necessarily naturally good at this. The conditions can, however, be created to enable transformational learning and improved practice. Baguley and Brown’s (2009) account of nursing and education students swapping their written narratives about a critical incident highlighted the need for students to have help framing their peer feedback in ways that make it useful for learning. Other research has focused on mistakes as a rich and strategic source of learning when organisations provide a psychologically safe context and process for employees to address them (Billett, 2001; de Groot, van den Berg, Endedijk, van Beukelen & Simons, 2011; Harteis, Bauer & Gruber, 2008; Hetzner, Gartmeier, Heid & Gruber, 2011). A related, growing body of workplace learning research analyses the “constraining” and “expansive” aspects of workplace environments in relation to managing mistakes and complexity, problem-solving, and meaningful learning (Ellström, Ekholm & Ellström, 2007; Engeström, 2001; Fuller & Unwin, 2004; Lucas, Spencer & Claxton, 2012; Vaughan, O'Neil & Cameron, 2011). The new carpentry assessment system was built on an expansive model of learning that benefitted not only the apprentices but also the entire assessment team, and all field staff and management.

Social moderation and community of practice

It was the assessment system’s moderation arrangements that provided this expansive learning platform. Given the high-stakes nature of carpentry assessment – resulting in a qualification – the BCITO’s moderation arrangements needed to provide reassurance about consistency in three key areas: the evidence provided by employers; the decisions made by training advisors; and organisation-level structures and processes, and apprentice activities. The BCITO therefore set up a moderation approach which not only addressed the three key areas but also provided professional development for training advisors and moderators.

The key to enabling reflection and learning came through the moderation design as “social moderation”, which harnessed an explicit acknowledgement of the social nature of learning and the consequently dynamic processes of assessment judgements (Gipps, 1994). Social moderation provided opportunities for rich professional conversations that can improve both individual training advisors’ practice and the performance of the system overall. These professional conversations occurred in two main ways: through direct moderator-training advisor interactions and through whole-organisation meetings.

The first, and most frequent, form of professional conversation occurred through a “ride-along”, where a moderator accompanied a training advisor on a site visit. The moderator was effectively a lead practitioner and observed the assessment activities on these visits, providing a post-visit analysis through conversation with the training advisor. Rather than only looking “backward” to evaluate the consistency of judgements already made, the moderator and training advisor together “looked forward” to an examination of practices, as well as outcomes.

The second, organisation-wide professional conversation occurred through the National Moderation Workshops. These brought together the national group of training advisors and moderators for professional development through group discussions across different regions and expertise levels. With an experienced person facilitating, each group worked through invented scenarios and exemplars of good practice, including artefacts such as training plans and knowledge evaluation guides (workbooks and portfolios used and created by apprentices). In a normal working day, training advisors and moderators spend most of their time “thinking on their feet”, so the workshop performed a vital role in providing space for reflection and gave everyone an equal opportunity to participate in the various roles of discourse and build new understanding (Mezirow, 1995). As they discussed their experiences from the field, training advisors and moderators grew their understanding of the assessment process and their decision-making confidence.

Training advisors drew on both tacit and explicit knowledge, both as individuals and as a group. In doing so they produced a kind of “craft intimacy” around shared problems and a sense of commonality (Wenger, McDermott & Snyder, 2002). Klenowski and Adie use Cook and Brown’s (1999) term knowing in action to explain the way this activity leads to the production of new knowledge and knowing about standards in relation to judgements in the practice of moderation (Klenowski & Adie, 2010, p. 121). This new knowledge also supported the ITO in its standards-setting and industry leadership roles. Field staff (training advisors, moderators) and management staff continued to learn about standards and qualifications, which then informed qualifications development and revision to keep qualifications aligned with the needs of industry.

The overarching strategic aim of the BCITO’s moderation approach was the creation of an assessment community of practice – a social learning system that can create an ideal environment for harnessing tacit knowledge by combining community, domain, and practice in relation to that shared domain of activity (Wenger et al., 2002). While community of practice is the most widely known term, there are others which refer to very similar concepts about reflective practice and shared knowledge construction that might apply here: lifelong learning networks (Koper et al., 2005), knowledge communities (Paavola, Lipponen & Hakkarainen, 2002), learning networks (Poell, Chivers, Van der Krogt & Wildemeersch, 2000), and shared narrative practice (Baguley & Brown, 2009). They all have in common a sense of a broader framework for thinking about learning in its social, rather than solely individual, dimensions (Wenger, 2010). Although communities of practice seem to work best when created across and outside of hierarchical and organisational lines, this one was based on collegiality and included free space for thinking together.

The social learning occurring through the moderation “ride-along” and National Moderation Workshops benefitted not only field staff but also the apprentices. Hipkins and Robertson’s (2011) paper on the use of social moderation to develop teachers’ communities of practice cites Wenger’s (1998) social learning theory to argue that professional learning for teachers can lead to richer learning opportunities and achievement for students. The carpentry social moderation arrangements functioned similarly. The previous carpentry assessment system had used structured worksheets as assessment tasks, but these were not effective in discriminating between what really mattered (the intent of the standard) and what was intended more as guidance for assessors (items in a listed range statement). The rigidity was further exacerbated by the instruction to assessors to ensure that “all questions are answered correctly”. The instruction had sat alongside a set of model answers, which were intended to guide the assessor, but which instead became “gospel”. That sometimes led to correct answers being overlooked because they did not comply with the model answers, or to oral questioning being conducted in a rigid way that failed to elicit what the apprentice actually knew.

Although these approaches arose from good intentions to ensure the consistency of assessment judgements, they had resulted in a lack of alignment between the assessment process and the learning environment. Apprentices would tend to complete the theory work in their worksheets (e.g. installing hardware) long before they were in a position, or sufficiently trusted, to actually do that work onsite (e.g. do chisel work on a door). The new system was thus designed to clearly establish relationships between supportive learning environments and purposeful, professional assessment of learners’ progress. Rather than a frogmarch through rigidly ordered standards, assessment was driven by the apprentice’s actual work. Social moderation through the community of practice could exemplify for field staff the same kinds of relationships between learning environment and learner progress as the assessment process itself.

Conclusion: realignment and renegotiation for learning

The community of practice played a key role in enabling reflection and learning from the critical incident described in this article. However it was not simply a matter of the training advisor and/or moderator bringing the story of the critical incident to a National Moderation Workshop. In fact they were fairly circumspect in how they discussed the incident because it took some time to establish a reflective learning culture through the National Moderation Workshops. In the first year or two, many training advisors not only struggled to understand the more complex assessment knowledge and skills required of them but also the sort of “rules of engagement” that would best allow everyone to learn together. Many had initially come from a male-dominated building industry culture where men “hardened up” and were loath to express any uncertainty. It took some time to create an environment in which people could ask for help, share knowledge constructively, and understand the distinction between criticism and critical reflection. Indeed management at first avoided using the term “community of practice” with field staff, introducing it only later when the idea of thinking together had become accepted.

As it happens, the training advisor and moderator found ways to safely introduce the elements of their experience into community discussions. They worked as a team, recounting their experience to the other training advisors and moderators in both formal meetings and the informal ones that occur during road trips together or in spaces such as staffrooms. Together, the training advisor and moderator modelled stepping outside a building industry culture that tends to constrain people’s willingness to ask for help, and they drew attention to the positive power of admitting a mistake. The experience of carefully doing this strengthened their relationship with each other, and with other training advisors and moderators. It often happens that newcomers to a community of practice are pulled along by the community’s competence until their experience “catches up”, or, conversely, that a new experience and new element of practice can also pull the community’s competence along (Wenger, 2010). In this case, as the training advisor and moderator engaged with the community, they helped establish its culture, resulting in a realignment of relations as well as practices.

Realignments were also evident in the rest of the assessment team. In his second interview, the apprentice showed the researcher his work diary entry from the day of the critical incident, five months earlier. The incident was described as “a shambles”. The apprentice went on to reflect on the difference between the current, onsite assessment visit that the researcher had just observed and the previous offsite one. “When you’re onsite, you can see things better. There’s examples out here, onsite, that I can point to”, he explained. However, in addition to developing his understanding of why he found it intuitively better to be assessed onsite, he had also reconsidered the way he was approaching his work and learning so that he could feel more confident about expressing his knowledge off-site. He reported being challenged to take a more active role in gathering appropriate evidence and presenting it – no mean feat for a young person who, like most carpentry apprentices, disliked dealing with paper-based materials and was unused to being in charge of his learning. He described having established a study partnership arrangement with a friend, moving himself away from aimless reading to making meaning from the text with someone else. He had also renegotiated his relationship with his employer, paying more attention to “the tricks of the trade” he was being shown and enquiring into the principles behind these.

The employer’s second interview also revealed a set of renegotiations. He had used the incident as a catalyst to get more guidance from the training advisor about how best to support the apprentice. “I’m getting pointers on how to talk to him, how to explain what I want done, how I want it done, and why”, he said. His request for guidance from the training advisor had developed into a better collaboration around the assessment team roles themselves. The employer reported feeling that the significance of his role as the trainer and evaluator of evidence was now better acknowledged. It opened the door for more discussion about how he could help collect and bear out evidence provided by the apprentice. It made for a more explicit partnership between the training advisor, rarely onsite himself, and the employer who could be his “eyes and ears”.

The researcher’s attendance at, and observation of, several National Moderation Workshops allowed her to continue to reflect on her role as a researcher. She came to see herself as a “traveller” rather than “miner” of data, and a “methodological tool” in the research process (Kvale, 1996). Her previous understanding of communities of practice and their levels of participation – core, active, periphery (Wenger et al., 2002) – meant she considered herself an outsider – a non-member with an interest in the community. However she came to see things as more complex. By virtue of her presentations about the research at various events, including National Moderation Workshops, and her participation in workshop discussion groups, she performed something of a coordination role, helping to connect members and build “benches for those on the sidelines [and making] opportunities for interaction to keep the periphery engaged and connected” (Wenger et al., 2002, p. 57). Her researcher role allowed her to act as a mirror to practice that all field staff could then use.

During his second interview, the training advisor reported that he was now using the researcher’s site visit as a catalyst to spend more time with the employer (while the researcher interviewed the apprentice). The training advisor had noticed that the apprentice was now more reflective in his answers to assessment questions, and that the apprentice and employer were “working smarter” together. He reported finding that his work with the moderator since the critical incident had opened up a new space for his own learning. He incorporated this into the interactions between his sole-charge BCITO branch office and other larger, more urban branches. He now had even more licence to ring other branches and “chew the fat” on assessment matters.

The training advisor has since moved into a moderator role, responsible for the professional development of training advisors. This has given him more space to share his experience of the critical incident and to engage with other members of the community. He has given training advisors permission to make mistakes and learn from them. In doing so, he has moved beyond what he initially described as a “mortifying” incident, and worked with others to renegotiate the practices and relations of on-job assessment. That knowledge of practice, together with the work of the researcher, has fed into the organisation’s development of new resources for field staff, apprentices, and employers.

The manager and designer of the assessment system has since reflected that he was glad the critical incident had involved a training advisor who was openly acknowledged in the organisation as a particularly proficient assessor. Because the training advisor was held in high esteem by other training advisors, they were attentive to his experience and interested to learn from him. Being involved in the research led the manager to introduce the term “community of practice” to assessment staff, lending more structure to assessment arrangements and professional learning in the new system. He describes the critical incident and the idea of community of practice as integral to each other, becoming part of the BCITO assessment “lexicon”.

We suggest that the critical incident described in this article is a significant one for those who are designing, managing, and participating in systems for assessing on-job learning. In rigid assessment systems concerned only with summative judgements, and designed around the use of model answers, the incident would probably have been regarded as exposing a weakness in the system and in the individuals involved. However the BCITO’s system has formative (assessment for learning) purposes, as well as summative (assessment of learning) ones. It therefore usefully reinforces the BCITO’s idea of an “assessment team”, where everyone is a learner. This is an important capability-building focus in a sector which increasingly needs lifelong learners who can recognise, respond to, and lead in relation to new and complex demands in design, legislation, infrastructure development, and customer needs. The incident is a good lesson in how professional conversation and reflective practice can be used to help assessors (training advisors and moderators), trainers/evaluators (employers), and learners, as well as the assessment system itself, come out stronger.

The training advisor, moderator, researcher, and general manager (assessment system designer) have an ongoing relationship through another research project. Any time we have occasion to meet, we reflect again on that incident. Much educational research is about moving across boundaries or breaking down boundaries; boundary-crossing is often experienced as liberating. However our experience and “shared history of learning” in relation to the incident has produced a certain boundary between us and those who were not directly involved (Wenger, 2010). We have not found that this separates us from the community (of assessment or of research). Rather it has served to underline the special significance of one critical incident for our evolving “landscape of practices” (Wenger, 2010) in relation to assessment, research, professional development, collaboration, management, industry practice, and regulation. The ongoing narrative is not owned by any one of us but is generated between us. It continues to be part of the repertoire of resources being developed by the community of practice.

Acknowledgments The authors would like to thank the apprentices, employers, and ITO staff who generously gave their time to share their experiences and perspectives with us.

References

Baguley, M., & Brown, A. (2009). Critical friends: an investigation of shared narrative practice between education and nursing graduates. Teaching in Higher Education, 14(2), 195–207.

Billett, S. (2001). Learning through work: Workplace affordances and individual engagement. Journal of Workplace Learning, 13(5), 209–214.

Brockbank, A., & McGill, I. (2007). Facilitating Reflective Learning in Higher Education (2nd ed.). Maidenhead, England: Open University Press.

de Groot, E., van den Berg, B. A. M., Endedijk, M. D., van Beukelen, P., & Simons, P. R. J. (2011). Critically reflective work behaviour within professionals' learning communities. Vocations and Learning, 4, 41–61.

Ellstrom, E., Ekholm, B., & Ellstrom, P. (2007). Two types of learning environment: Enabling and constraining a study of care work. Journal of Workplace Learning, 20(2), 84–97.

Engestrom, Y. (2001). Expansive learning at work: towards an activity theoretical conceptualisation. Journal of Education and Work, 14(1), 133–156.

Fuller, A., & Unwin, L. (2004). Expansive Learning Environments: integrating organisational and personal development. In H. Rainbird, A. Fuller, & A. Munro (Eds.), Workplace Learning in Context (pp. 126–144). London: Routledge.

Gipps, C. V. (1994). Beyond testing: Towards a theory of educational assessment. London: Falmer Press.

Harteis, C., Bauer, W., & Gruber, H. (2008). The culture of learning from mistakes: How employees handle mistakes in everyday work. International Journal of Educational Research, 47, 223–231.

Harteis, C., Gruber, H., & Hertramph, H. (2010). How epistemic beliefs influence e-learning in daily work-life. Educational Technology and Society, 13(3), 201–211.

Hetzner, S., Gartmeier, M., Heid, H., & Gruber, H. (2011). Error Orientation and Reflection at Work. Vocations and Learning, 4(1), 25–39.

Hinett, K. (2002). Retrieved from The Higher Education Academy. http://www.heacademy.ac.uk/assets/documents/resources/database/id485_improving_learning_part_one.pdf.

Hipkins, R., & Robertson, S. (2011). Moderation and Teacher Learning. What Can Research Tell Us About Their Relationships? Wellington: New Zealand Council for Educational Research.

Klenowski, V., & Adie, L. E. (2010). Standards, teacher judgement and moderation in contexts of national curriculum and assessment reform. Assessment Matters, 2, 107–131.

Koper, R., Giesbers, B., van Rosmalen, P., Sloep, P., van Bruggen, J., Tattersall, C., et al. (2005). A Design Model for Lifelong Learning Networks. Interactive Learning Environments, 13(1–2), 71–92.

Kvale, S. (1996). InterViews: An Introduction to Qualitative Research Interviewing. California: Sage Publications, Inc.

Lucas, B., Spencer, E., & Claxton, G. (2012). How to Teach Vocational Education. A Theory of Vocational Pedagogy. London: City & Guilds Centre for Skills Development.

Mercer, J. (2007). The challenges of insider research in educational institutions: wielding a double-edged sword and resolving delicate dilemmas. Oxford Review of Education, 33(1), 1–17.

Paavola, S., Lipponen, L., & Hakkarainen, K. (2002). Epistemological Foundations for CSCL: A Comparison of Three Models of Innovative Knowledge Communities. In G. Stahl (Ed.), CSCL '02 Proceedings of the Conference on Computer Support for Collaborative Learning: Foundations for a CSCL Community (pp. 24–32). Germany: International Society of the Learning Sciences.

Poell, R. F., Chivers, G. E., Van der Krogt, F. J., & Wildemeersch, D. A. (2000). Learning-network Theory: Organizing the Dynamic Relationships Between Learning and Work. Management Learning, 31(1), 25–49.

Segers, M., & De Greef, M. (2011). Transformational learning: the perspective of J. Mezirow. In F. Dochy, D. Gijbels, M. Segers, & P. v. d. Bossche (Eds.), Theories of Learning for the Workplace. Building Blocks for Training and Professional Development Programs (pp. 37–51). Abingdon: Routledge.

Shah, S. J. A. (2004). The researcher/interviewer in intercultural context: a social intruder! British Educational Research Journal, 30(4), 549–575.

Vaughan, K., & Cameron, M. (2009). Assessment of Learning in the Workplace: A Background Paper. Wellington: Ako Aotearoa.

Vaughan, K., & Cameron, M. (2010a). A Guide to Good Practice in Industry Training Organisation Structures and Systems for On-Job Assessment. Wellington: Ako Aotearoa.

Vaughan, K., & Cameron, M. (2010b). ITO Assessment Structures and Systems: Survey and Focus Group Findings. Wellington: Ako Aotearoa.

Vaughan, K., O'Neil, P., & Cameron, M. (2011). Successful Workplace Learning: How Learning Happens at Work. Wellington: Industry Training Federation.

Vaughan, K., Gardiner, B., & Eyre, J. (2012). A Transformational System for On-Job Assessment in the Building and Construction Industries. Wellington: Ako Aotearoa.

Wenger, E. (2010). Communities of practice and social learning systems: the career of a concept. In C. Blackmore (Ed.), Social Learning Systems and Communities of Practice (pp. 179–198). London: Springer-Verlag and the Open University.

Wenger, E., McDermott, R., & Snyder, W. M. (2002). Cultivating Communities of Practice. Boston: Harvard Business School Press.

Karen Vaughan is a Chief Researcher at the New Zealand Council for Educational Research.
