EVALUATOR COMPETENCIES AND PERFORMANCE DEVELOPMENT

Martha McGuire
Rochelle Zorzi
Cathexis Consulting, Inc., Toronto, Ontario

The Canadian Journal of Program Evaluation, Vol. 20, No. 2, Pages 73–99. ISSN 0834-1516. Copyright © 2005 Canadian Evaluation Society.

Corresponding author: Martha McGuire, Cathexis Consulting, 672 Dupont Street, #402, Toronto, ON M6G 1Z6; <[email protected]>

Abstract: Supporting professional development in evaluators can be challenging because evaluators come from varied backgrounds and conduct many different types of evaluations. Evaluation competencies are a means of determining the developmental needs of individual evaluators, and can be used as the foundation for a comprehensive performance development system within organizations that do evaluations. This article defines professional development in the context of strategic human resource development and outlines the elements of a human resource development system, showing how evaluation competencies can be used as a basis for the system. The article gives an example of the system's application, provides samples of tools that can be used for self-reflection and assessment, and outlines the benefits of a human resource development system.
Supporting professional development in evaluators can be challenging because evaluators come from varied backgrounds and conduct many different types of evaluations. Given the tremendous variety of skills, knowledge, and approaches in evaluation, how does one assess an employee's level of competence and/or support ongoing professional development?
Over the years, evaluation professionals have attempted to define the field, developing lists of competencies that are important for people conducting evaluations. Two recent efforts have produced comprehensive inventories of evaluation knowledge and skills. In 2001, King, Stevahn, Ghere, and Minnema developed a preliminary taxonomy of essential competencies. The following year, the Canadian Evaluation Society (CES) developed a comprehensive list of evaluation knowledge elements used for evaluation (Zorzi, McGuire, & Perrin, 2002). Although these two efforts do not provide a definitive list of skills and knowledge that every evaluator should have, such competency inventories can be used as the basis for strategic human resource development within organizations that conduct evaluations.
This article will focus on the use of the competencies as an integral component of human resource development. It will provide background on the evaluation competencies, discuss how these link to strategic human resource development, provide a case example of the use of evaluation competencies in human resource development, and discuss the implications for practice.
BACKGROUND ON EVALUATION COMPETENCIES
Evaluation competencies are the skills, knowledge, abilities, and attributes required to conduct evaluation. Competencies are of interest to evaluation associations and educators who wish to develop a core curriculum for those who practice evaluation. They are also relevant to those who wish to ensure or promote quality in evaluation practice, the logic being that individuals who possess the requisite competencies are more likely to produce high quality, useful evaluations. Consequently, clients and employers might be interested in competencies, as might evaluation associations or others looking to protect the reputation of the discipline.
Evaluators come from so many different backgrounds, experiences, and methodological approaches that it has been challenging to define exactly what makes a competent evaluator. The knowledge and skills
required to conduct an evaluation vary from situation to situation, depending on the type of evaluation being conducted, the level of rigour needed, and the specific needs of the client and other stakeholders. For example, an evaluation of a program or policy could be conducted using an ethnographic approach, with in-depth qualitative methods and rigorous qualitative analyses, or it might instead use a randomized controlled experiment, with standardized measures and complex statistical analyses. Alternatively, it could take a very pragmatic approach, with mixed methods and analysis at a relatively superficial level to quickly identify emergent themes for decision-making. The competencies required of the evaluator or evaluation team would clearly be different in each of these three scenarios.
The W. K. Kellogg Foundation (1998) provides advice on what to look for in an evaluator, depending on what the evaluation is intended to do.
If the evaluation purpose is to determine the worth or merit of a program, you might look for an evaluator with methodological expertise and experience. If the evaluation is focused on facilitating program improvements, you might look for someone who has a good understanding of the program and is reflective. If the primary goal of the evaluation is to design new programs based on what works, an effective evaluator would need to be a strong team player with analytical skills. Experience tells us however that the most important overall characteristics to look for in an evaluator are the ability to remain flexible and to problem-solve. (pp. 59–60)
Most evaluation associations have guidelines or standards that reflect the situational nature of evaluation competencies. For example, the Canadian Evaluation Society's Guidelines for Ethical Conduct (n.d.) state:
Evaluators are to be competent in their provision of service.
1.1 Evaluators should apply systematic methods of inquiry appropriate to the evaluation.
1.2 Evaluators should possess or provide content knowledge appropriate for the evaluation.
1.3 Evaluators should continuously strive to improve their methodological and practice skills.
Likewise, the American Evaluation Association's Guiding Principles for Evaluators (AEA, 2004) require evaluators to "provide competent performance to stakeholders," specifically noting that evaluators must possess the requisite education, abilities, skills, and experience to undertake the evaluation; demonstrate cultural competence; practice within the limits of their competence; and maintain and improve their competencies.
Both the CES and the AEA guidelines note the need for continuous improvement in competencies. This is especially important given the evolving nature of evaluation. As noted by McLean (2000),
Case studies, performance indicators, logic models, high-tech measurement, critical theory — none of these were discussed widely, if at all, even 20 years ago. The theory and practice of program evaluation are both rich and varied in ways no one predicted, as the annual conferences of the CES and AEA attest. What we can safely predict is that they will continue to evolve and grow in exciting ways. (p. 189)
Despite the variety and evolution of evaluations, competency inventories have been developed and there appears to be a high degree of agreement on some basic elements. King et al. (2001) developed such an inventory based on their exploratory study on the extent to which evaluation professionals could reach agreement on essential evaluation competencies. They concluded that there may be more agreement on the competencies needed by evaluators than initially anticipated, based on finding 78% agreement on the competencies in their taxonomy. They also concluded that the areas where consensus did not emerge reflected the role- and context-specific nature of evaluation practice, thus supporting the notion that the knowledge depends on the purpose and context of the evaluation. Their table of essential evaluator competencies is comprehensive and shows areas of agreement and disagreement.
A second study carried out for the CES (Zorzi et al., 2002) resulted in a comprehensive list of evaluation competencies. This study involved consultation via the Internet with evaluation practitioners both within and outside of Canada. Practitioners were asked to consider a specific evaluation in which they had participated in the past, and to identify the competencies they needed to complete it so that it resulted in benefits for the program being evaluated. The results
of the survey were interpreted with the assistance of an international reference panel of 36 evaluation experts with diverse backgrounds.
The CES study identified 23 general knowledge elements, within which more specific knowledge, skills, and practices were identified. Reference materials were identified for each specific item. The authors were unable to identify a list of core competencies that every evaluator should have, and they noted that it was neither possible nor desirable for any one person to be competent in all areas. Instead, they emphasized that evaluators need to be (a) aware of the different methods and approaches, (b) able to realistically assess their own capabilities, and (c) able to assemble teams of people with the knowledge and skills needed for a specific evaluation.
In sum, the competencies needed to conduct an evaluation vary depending on the purpose and context of the evaluation, evaluation competencies are constantly evolving, and no individual need be competent in all areas.
Taking all of this into account, it is clearly not easy to determine a single set of competencies that all evaluators must possess. This poses a challenge for organizations that employ evaluators and want to manage and further develop the performance of their evaluators. These organizations may find it difficult to support performance development without a well-defined set of required competencies. However, as will be demonstrated shortly, competency inventories can be used as the basis for performance development within organizations that conduct evaluations.
BACKGROUND ON HUMAN RESOURCE DEVELOPMENT (HRD)
The workplace is an important context for ongoing professional development, and is the primary location for human resource development (Bierema & Eraut, 2004). Human resource development links professional development with organizational expectations and the provision of systematic assessment and learning experiences in order to bring about performance improvement and professional growth, thus providing a context for the application of evaluation competencies.
"Organizations work the way they do because of the way people work in those organizations" (Senge, 1996, p. 19). Recently, increased attention has been given to strategic human resource development that includes the integration of training and development into wider business planning. Garavan (1991) points to the need to place continuous knowledge development in the context of the external environment. Human resource development plans and policies need to be linked to the organization's business plan, with monitoring of the external environment (Garavan, 1991; McCracken & Wallace, 2000). The mission and goals of the specific organization combine with professional competencies to define the performance expectations of a practicing evaluator.
The human resource development system establishes the means by which the organization supports individuals so that they are able to perform their current job functions, and creates an environment for growth. Increasingly, attention has been paid to "learning organizations" that create a workplace where people can continuously expand their capacities. Some of the elements of a learning organization include application of systems management methods, the development of openness and trust, a tolerance for error, finding new ways of reframing and thinking through issues and problems, and embedded self-reflection (Ellinger, 2004; Smith, 2004). The concept of critical self-reflection has emerged from adult learning theory. Van Woerkom (2004) points out that human resource development practices should not only play a role in the development of competencies, but should also support critically reflective work behaviour.
In order to be effective, a human resource development system must meet the needs of the organization, meet the needs and wishes of the individual, and be consistent with the expectations of the profession.
ELEMENTS OF A STRATEGIC HUMAN RESOURCE DEVELOPMENT SYSTEM
A human resource development system is often established in consultation with those who are most affected by it. It includes the following elements: a conceptual framework, articulation of the expectations of each position within the organization, support for self-reflection, ongoing feedback, a formal assessment, and an individual learning and development plan that is supported by the organization.
Conceptual Framework
A human resource development conceptual framework provides the foundation for the performance appraisal process by identifying the mission, values, and goals of the organization as well as the skills, behaviours, and knowledge that are fundamental to the organization and the work that it carries out. For evaluation positions within an organization, the evaluation competencies would be combined with attributes, behaviours, and knowledge that are specific to the organization. The framework would include the core values that each individual within the organization is expected to demonstrate, the skills and knowledge essential to support the organization's mission and goals, and an indication of how professional competencies are to be integrated into the human resource development system. For example, a hospital may have stated values such as individual-focused care, pursuit of excellence, and evidence-based practice. The primary goal may be provision of emergency, in-patient, and out-patient health care services to a particular population. All individuals within the hospital would be expected to embrace the stated values and support the overall goal, within the context of their own professional practice. The conceptual framework establishes the foundation for addressing organizational expectations while at the same time ensuring that professional requirements are supported.
Position Description
A position description concretely defines the expectations of a particular job within the context of the human resource development conceptual framework. It outlines the responsibilities, accountabilities, and qualifications of the particular position, referencing the professional requirements associated with the position. It is used as a primary tool for defining job expectations.
Responsibilities can include professional activities to be carried out by the individual, as well as responsibilities related to the functioning of the organization.
The qualifications required should be consistent with the responsibilities of the position. For example, the position of junior researcher might be responsible for conducting literature reviews, conducting surveys, entering data, and assisting with quantitative analysis. The performance expectations of a junior researcher would therefore be substantially different than those of a project manager, who needs
to be highly skilled in project management and have advanced skills in most other evaluation areas. An inventory of evaluation competencies can be helpful for determining the professional qualifications that are appropriate for a particular evaluator position.
Self-reflection
Self-reflection links to the conceptual framework, the position description, and personal aspirations. It is an ongoing process in which a staff member reflects on his/her knowledge, skills, and practices, and identifies personal strengths as well as areas that require improvement. It can occur informally through discussions around the composition of a team for a particular assignment and the role of each evaluator in that assignment. It can also occur as an evaluator carries out an assignment and finds there are gaps in his/her knowledge or skills. During the performance development process, it may also be appropriate to conduct a formal self-assessment of knowledge and skills that are relevant to one's position.
Ongoing Feedback
Ongoing feedback contributes to the self-reflection process and helps to create a learning environment. It can be provided informally through discussions around building a team for an assignment or providing support in carrying out an assignment. The 360° approach is useful in this informal process; that is, feedback should come from superiors, subordinates, peers, clients, and anyone else who is involved in or impacted by the particular individual's work. Any member of the organization can be approached and asked for guidance in their area of expertise, and any member can offer suggestions. This provides continuous feedback from a number of sources, creates opportunities for continuous learning, and supports a learning environment within the organization.
Formal Assessment
The formal assessment is what most people think about when they hear the term human resource development. It brings together the ongoing feedback and self-reflection. It can include a 360° process where formal feedback is received from a range of people who would be aware of or impacted by the individual's performance. Minimally, both the manager and the individual should spend time reflecting
on the individual's performance. The process generally involves the following steps:
1. Preparation through the completion of a performance appraisal form by the individual and the manager, as well as others who have knowledge and could provide constructive feedback. The performance appraisal form should be designed to support critical self-reflection by the individual, and should include those competencies that are key to the individual's position, including both evaluation and non-evaluation competencies.
2. A performance appraisal interview, in which the employee and his/her supervisor discuss the employee's strengths, as well as areas that could be improved. Even for a highly competent employee, this can be a very stressful meeting. Just as with evaluation feedback, beginning with (and focusing on) the positive aspects of an individual's performance helps the individual accept feedback about areas that require development. The feedback is more likely to be acted upon if it is placed in the context of the individual's own goals, both personal and professional. It is important for the manager to provide an atmosphere in which the individual feels free to discuss goals and aspirations. This sets a positive context for discussing areas of future development. If there are serious performance issues, those need to be addressed in the most constructive way possible.
Personal Learning Development Planning
Personal learning development planning is a critical element of a learning environment. If it is to be useful, it must be part of the formal assessment process and continue throughout the year. In this part of the process, the employee and his/her supervisor make concrete plans for actions that will increase the levels of competence in specific areas. For example, once an evaluator has demonstrated competence in planning and conducting various types of evaluations, that person may wish to gain some project management experience. This knowledge can be gained through courses, workshops, mentoring, and/or experiential learning. Often, a combination of learning opportunities is used. The manager and the individual must either plan a specific opportunity or seek opportunities to gain the skill. The development of an individual rests not only with that person, but also with the organization.
It is important to revisit the personal learning development plan throughout the year to assess progress and make needed adjustments. The personal learning development plan shown in Appendix B includes timelines for achievement of learning goals, as well as indicators of achievement, which facilitate ongoing self-reflection and performance appraisal.
CASE EXAMPLE: USING EVALUATION COMPETENCIES FOR HUMAN RESOURCE DEVELOPMENT IN A SMALL CONSULTING FIRM
In this section, we describe how the staff of a small consulting firm undertook to establish a human resource development system that would provide them with feedback and support for ongoing professional development in evaluation. The firm was approximately three years old and had four staff members at the time the system was developed. All staff members of the firm were involved in strategic-level discussions about the process and in reviewing the tools to be used.
Conceptual Framework
As part of the organizational development process, the goals of the firm were articulated first through informal discussions among staff and then formalized into the following mission and value statement:
Our mission is to contribute to the improvement of programs, organizations, systems, and society as a whole by conducting high quality evaluations and reviews through sound methodology, using appreciative, respectful, and transformative processes. The underlying values in carrying out our work include creativity, innovation, excellence, honesty, integrity, timeliness, rigour, and fun.
The human resource development conceptual framework integrated the mission and value statement with the core competencies outlined in the CES core body of knowledge work (Zorzi et al., 2002). One of the challenges was determining expectations regarding levels of competencies. Ranges of expectation were agreed upon through lengthy discussion, with the understanding that each individual would bring a different mixture of attributes and competencies, and that together the team would provide the full range of competencies required for the organization to carry out its mission. In this way,
the human resource development processes contributed directly to the organizational development efforts and at the same time supported professional expectations.
Position Description
When the system was developed, the firm had a formal written position description only for the research associate position. Job expectations for junior- and senior-level consultants had not been explicitly stated, although the responsibilities of these positions were tacitly understood.
To define the professional qualifications required for each position, the staff members initially used the CES's list of evaluator competencies (Zorzi et al., 2002). They developed an assessment tool that listed each competency, and collectively rated the minimum required level of knowledge or skill for each position in the firm.
Minimum levels for evaluation knowledge were rated on the following scale:

0 = Not present (i.e., not required for the position)
1 = Basic (awareness of the idea, knowledge of some of the basic concepts)
2 = Intermediate (foundational knowledge, able to apply the knowledge appropriately)
3 = Advanced (thorough knowledge, able to apply the knowledge appropriately)
4 = Expert (sophisticated theoretical and applied knowledge, understands complexities in application)

Minimum competence levels for evaluation skills were set according to this scale:

0 = Not present (i.e., not required for the position)
1 = Basic (can perform with structured guidance)
2 = Intermediate (can perform with minimal guidance)
3 = Advanced (can perform independently)
4 = Expert (can perform with finesse and adapt to new situations)
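The two rating scales and the per-position minimums lend themselves to a simple data representation. The following sketch is purely illustrative: the competency names, positions, and levels are invented for the example and are not drawn from the CES inventory or the firm's actual tool.

```python
# Illustrative sketch: encoding the 0-4 competence scales and the
# minimum required level per position. All names and levels are
# hypothetical examples, not the firm's actual ratings.

KNOWLEDGE_SCALE = {
    0: "Not present (not required for the position)",
    1: "Basic (awareness of the idea, some basic concepts)",
    2: "Intermediate (foundational knowledge, applied appropriately)",
    3: "Advanced (thorough knowledge, applied appropriately)",
    4: "Expert (sophisticated theoretical and applied knowledge)",
}

SKILL_SCALE = {
    0: "Not present (not required for the position)",
    1: "Basic (can perform with structured guidance)",
    2: "Intermediate (can perform with minimal guidance)",
    3: "Advanced (can perform independently)",
    4: "Expert (can perform with finesse and adapt to new situations)",
}

# Minimum required levels for each position (invented values).
POSITION_MINIMUMS = {
    "junior researcher": {
        "literature review": 2,
        "survey methods": 2,
        "quantitative analysis": 1,
        "project management": 0,  # not required at this level
    },
    "project manager": {
        "literature review": 3,
        "survey methods": 3,
        "quantitative analysis": 3,
        "project management": 4,
    },
}

def meets_minimums(ratings, position):
    """Return the competencies where a staff member's rating falls
    below the minimum for the position, as {name: (actual, required)}.
    An empty result means all minimums are met."""
    minimums = POSITION_MINIMUMS[position]
    return {c: (ratings.get(c, 0), req)
            for c, req in minimums.items()
            if ratings.get(c, 0) < req}
```

For instance, a junior researcher rated 2 in literature review but only 1 in survey methods would be flagged on survey methods alone, which mirrors how the tool highlights developmental needs against position requirements.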
Because the list of competencies was quite long and technical (the full list of competencies is shown in a similar tool in Appendix A),
the process was time-consuming and not all staff members were able to complete it. In the end, it was necessary for senior-level staff to reduce the list of competencies to those directly related to the job responsibilities and to assign minimum competence levels for these. The full list was useful primarily as a guide for experienced evaluators to reference in selecting applicable qualifications.
As an aside, using the complete version of the tool was useful for identifying areas of specialty that were important for the firm, but that would not necessarily be required of every staff member in a given position. For example, the firm determined that it would like to have at least one staff member with expert-level knowledge about various types of sampling and measurement. While this would not be a minimum requirement for a given position, the firm selected staff members with aptitude in this area and supported their professional development so that they would develop such expertise.
In addition to professional competencies, each position required other competencies that related to corporate citizenship. For example, competencies related to teamwork and communication were included in the position descriptions.
Self-reflection
The self-assessment tool in Appendix A, which is based on the CES list of evaluator competencies (Zorzi et al., 2002), was made available to staff for self-reflection. Some staff chose not to use this tool, preferring a more intuitive process. Others found the tool very useful for identifying areas in which they would like to further develop their knowledge or skills. Knowing the required minimum competency level for each knowledge element or skill made the self-assessment tool less onerous, because staff members could choose to ignore those competencies that were not required for their position.
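The filtering that made the tool less onerous can be expressed as a small helper: the staff member reviews only those competencies whose required minimum level is above zero for their position. As with the earlier sketch, the competency names and levels here are invented examples, not the CES items themselves.

```python
# Illustrative sketch of narrowing the long competency inventory to
# what a given position actually requires. Names/levels are invented.

def relevant_competencies(minimums):
    """Keep only competencies with a non-zero required minimum level,
    i.e., the ones worth including in a self-assessment."""
    return {c: lvl for c, lvl in minimums.items() if lvl > 0}

junior_minimums = {
    "literature review": 2,
    "survey methods": 2,
    "quantitative analysis": 1,
    "project management": 0,  # not required for this position
}

to_review = relevant_competencies(junior_minimums)
# "project management" drops out; only the three required
# competencies remain for self-reflection
```

In practice this is why publishing the minimum levels alongside the inventory matters: it converts an overwhelming master list into a short, position-specific checklist.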
Ongoing Feedback
In addition to regular feedback from supervisors and project managers, the firm supported informal ongoing feedback with each project. "Post-mortem" meetings were held to analyze what worked well and what did not for a given project, particularly when significant challenges had been encountered. In these meetings, all staff members had an opportunity to reflect critically on the project and
provide constructive feedback to their colleagues. The success of this method is rooted in the open and collegial atmosphere in the firm, where constructive criticism is encouraged on a day-to-day basis.
Formal Assessment and Personal Learning Development Planning
Formal performance appraisals were conducted for each staff member, using the performance appraisal form shown in Appendix B. This form was based on the key elements of the position descriptions. (Because all staff members in this organization conducted the same type of work, the firm was able to use a single performance appraisal form for all staff, modifying only the level of expectation. In organizations where job descriptions are significantly different, multiple forms would be required.) Both the staff member and his/her supervisor(s) completed the form. They then met to discuss the form, as well as the self-assessment the staff member had previously completed. Staff members were encouraged to identify their own aspirations, strengths, and areas requiring development. The supervisor also identified areas where they wanted to see professional growth.
Through this process, staff members and their supervisor identified both long-term and short-term learning goals, and created a learning plan for the next year, using the template in Appendix B. The firm allocated resources for appropriate professional development activities (e.g., conferences, workshops, journals, textbooks, mentors).
The personal learning plan provides an additional element for reflection by both the individual and the organization in the following year's performance appraisal process. The individual can reflect on the extent to which he/she carried out the plan. The organization must consider the extent to which it supported the individual in carrying out the plan.
Reflections on the Process
A human resource development system works only to the extent that it is used and supported. Often the work of the organization is given priority over human resource development. In this case, human resource development was a priority for all staff. Individuals took the task of self-reflection seriously and determined areas where development was needed. Staff were eager for formalized feedback and ensured that the process occurred. A personal development plan was created for each individual, with initial thinking on how that plan could be implemented. The organization supported the plans by creating experiential opportunities such as managing a project for the first time, allowing work time for courses, and paying workshop fees.
The primary gap in the current system is evaluation of the head of the organization. In a non-profit organization, this would be carried out by the board of directors. In this case, the staff need to engage in the process with the support of the head so that the inherent power differentials do not undermine the ability to enter into a critically constructive process.
DISCUSSION
Ongoing professional development is essential for evaluation practitioners to maintain their knowledge and skills in an evolving field. The challenge for organizations that employ evaluators is to determine how best to assess learning needs and to support professional development when there is such a wide range of competencies for evaluators.
In our case example, a comprehensive list of evaluation competencies (Zorzi et al., 2002) was used in the creation of a human resource development system for a small consulting firm. The success of this process required some key elements in the organization:
1. The firm already had a very open atmosphere that was conducive to informal feedback and constructive criticism. Staff members did not feel threatened when considering their own areas in need of improvement, because this was seen within the organization as a learning opportunity, not one that would put their jobs in jeopardy. Had there been an atmosphere where frank consideration of professional strengths and weaknesses was more threatening, or where feedback was destructive rather than constructive, the system would not have been as successful.
2. The staff in the firm reviewed the organization's mission, purpose, and values with an eye to identifying fundamental "corporate citizenship" competencies that were important for all staff. Without this link to the overall goals of the organization, the human resource development system
would have been disconnected from the day-to-day reality of the organization.
3. The firm had experienced evaluators on staff who were able to pick out the core competencies needed, and to determine the level of competency needed, for each of the positions. Without the evaluation knowledge provided by these senior staff members, it would have been very difficult to determine which competencies were important. There is a danger that organizations without the capability of assessing the relative importance of the different competencies will attempt to assess all of the competencies, which would be overwhelming and result in their seeking a "super evaluator" who possesses each and every one of the competencies. As previously noted, it is not realistic for a single evaluator to be competent in all areas.
4. The firm had a genuine interest in supporting staff members' professional development, and was willing to allocate resources to make it happen.
Future Directions
To help organizations develop human resource development systems for evaluators, we need more advanced tools that make it easier to determine which competencies are relevant and important for specific staff members. When the organization has experienced evaluators on staff, they can use the self-assessment tool presented in Appendix A as a reference, along with a description of the responsibilities in the position, to make these assessments. However, this process is time-consuming and would be difficult or impossible for a supervisor with little evaluation knowledge. It would be useful to develop a tool that would help organizations analyze the types of evaluations they need, and thereby define the qualities needed by their evaluators.
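In the meantime, the mechanical part of such an assessment is easy to automate: it amounts to a gap analysis between each self rating and the level required for the position, on the 0–4 knowledge competency scale used in Appendix A. The following is a minimal sketch; the function name, competency names, and ratings are illustrative assumptions, not drawn from any actual assessment:

```python
# Gap analysis on the 0-4 competency scale from Appendix A
# (0 = not present ... 4 = expert). All names and ratings are hypothetical.

def learning_priorities(self_ratings, required_levels):
    """Return (competency, gap) pairs where the self rating falls short
    of the level required for the position, largest gap first."""
    gaps = []
    for competency, required in required_levels.items():
        current = self_ratings.get(competency, 0)  # unrated -> not present
        if current < required:
            gaps.append((competency, required - current))
    return sorted(gaps, key=lambda item: item[1], reverse=True)

self_ratings = {"Survey research": 3, "Focus groups": 1, "Report writing": 2}
required_levels = {"Survey research": 2, "Focus groups": 3, "Report writing": 3}

for competency, gap in learning_priorities(self_ratings, required_levels):
    print(f"{competency}: develop by {gap} level(s)")
```

The sorted output gives the evaluator and supervisor a ranked starting point for the learning goals column of the self-assessment tool.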
Other Possible Uses of Evaluation Competencies
We believe that evaluation competencies would also be useful for other aspects of human resource development, as follows:
• Hiring – by articulating the evaluation competencies needed by a new staff member, it becomes easier to write job descriptions, post job openings, and develop appropriate assessment criteria and interview questions.
• Identifying organizational competencies – taking the position development idea to a macro level, it is possible to identify what competencies are required by the organization, and determine whether or not the staff (as a team) do indeed have the strengths required for the types of evaluations that they carry out. If those competencies are not strong in terms of the overall team, it is possible to consider hiring someone with the skills that are missing, or encouraging an existing team member to develop the skills.
• Assembling teams for specific projects – identifying specific project needs, and selecting team members with the required competencies. If the team does not have all of the required skills, it is often possible to look externally in order to fill the gaps. In some cases, a decision not to take on a project will be made if the competencies to do a high quality evaluation are not available.
• Helping each other learn – identifying expertise in team members and using each other as resources for learning.
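The "identifying organizational competencies" and "assembling teams" uses above lend themselves to a similar sketch: treat the team's capability in each competency as the strongest individual rating, and flag competencies that the planned evaluations require but no one covers at the needed level. The names, ratings, and aggregation-by-maximum rule here are illustrative assumptions, not a method prescribed in the article:

```python
# Team-level competency check on the 0-4 scale: the organization's strength
# in a competency is taken here as the strongest individual rating.
# All names and ratings are hypothetical.

def team_gaps(staff_ratings, project_requirements):
    """Return competencies the team cannot cover at the required level --
    candidates for hiring, training, or bringing in external expertise."""
    gaps = {}
    for competency, required in project_requirements.items():
        best = max((ratings.get(competency, 0)
                    for ratings in staff_ratings.values()), default=0)
        if best < required:
            gaps[competency] = required - best
    return gaps

staff_ratings = {
    "junior consultant": {"Interviews": 2, "Descriptive statistics": 1},
    "senior consultant": {"Interviews": 4, "Cost-effectiveness analysis": 1},
}
project_requirements = {"Interviews": 3, "Cost-effectiveness analysis": 3}

print(team_gaps(staff_ratings, project_requirements))
```

A non-empty result supports exactly the decisions described above: hire, develop an existing team member, look externally, or decline the project.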
REFERENCES
American Evaluation Association. (2004, July). Guiding principles for evaluators. Retrieved June 20, 2005 from <http://www.eval.org/Guiding%20Principles.htm>.

Bierema, L., & Eraut, M. (2004). Workplace-focused learning: Perspective on continuing professional education and human resource development. Advances in Developing Human Resources, 6(1), 52–68.

Canadian Evaluation Society. (n.d.). Guidelines for ethical conduct. Retrieved June 20, 2005 from <http://www.evaluationcanada.ca/site.cgi?s=5&ss=4&_lang=an>.

Ellinger, A.D. (2004). The concept of self-directed learning and its implications for human resource development. Advances in Developing Human Resources, 6(2), 158–174.

Garavan, T. (1991). Strategic human resource development. Journal of European Industrial Training, 15(1), 17–30.

King, J.A., Stevahn, L., Ghere, G., & Minnema, J. (2001). Toward a taxonomy of essential evaluator competencies. American Journal of Evaluation, 22(2), 229–248.
McCracken, M., & Wallace, M. (2000). Towards a redefinition of strategic HRD. Journal of European Industrial Training, 24(5), 281–290.

McLean, L. (2000). Reflections on program evaluation, 35 years on. Canadian Journal of Program Evaluation, Special Issue, 185–190.

Senge, P.M. (1996). The ecology of leadership. Leader to Leader (formerly the Drucker Institute), 2, 18–23.

Smith, I. (2004). Continuing professional development and workplace learning: HRD and organizational learning. Library Management, 25(1/2), 64–66.

Van Woerkom, M. (2004). The concept of critical reflection and its implications for human resource development. Advances in Developing Human Resources, 6(2), 178–193.

W.K. Kellogg Foundation. (1998). Evaluation handbook. Retrieved June 20, 2005 from <http://www.wkkf.org/Pubs/Tools/Evaluation/Pub770.pdf>.

Zorzi, R., McGuire, M., & Perrin, B. (2002). Canadian Evaluation Society project in support of advocacy and professional development: Evaluation benefits, outputs, and knowledge elements. Retrieved June 20, 2005 from <http://consultation.evaluationcanada.ca/pdf/ZorziCESReport.pdf>.
Martha McGuire has almost 20 years' experience in conducting evaluation, research, and organizational reviews. Throughout her career, she has helped many organizations improve their organizational processes and human resource systems. Martha is an active member of the Canadian Evaluation Society – Ontario Chapter and is currently serving on the National Council.

Rochelle Zorzi has been conducting program evaluation and applied research studies since 1995. She was project lead for the Canadian Evaluation Society's recent research into the benefits, outputs, and knowledge requirements for evaluation. Rochelle is an active member of the Canadian Evaluation Society – Ontario Chapter.
Appendix A: Self-Assessment Tool
KNOWLEDGE
Ethics and QA (Self Rating / Competency Level Required / Learning Goal)
• Knowledge and application of ethical guidelines
• Freedom of information and protection of privacy
• Awareness of the steps in conducting an evaluation
• Awareness of risks to the integrity of the evaluation process
• Application of standards for evaluation
• Meta-evaluation
Systems Theory (Self Rating / Competency Level Required / Learning Goal)
• Organizational development and change
• Knowledge management
• Evaluation's role in organizational development and change
• Evaluation uses (e.g., formative, summative)
• Understanding of how decisions are made in a political context
• Systems approaches, systems thinking
• Chaos and complexity theories
Specific Types of Evaluation (Self Rating / Competency Level Required / Learning Goal)
• Needs assessment
• Evaluability assessment
• Process evaluation/implementation evaluation
• Outcome evaluation/impact assessment
• Efficiency evaluation/cost analysis
Knowledge Competency Levels
0 = Not present
1 = Basic (awareness of the idea, knowledge of some of the basic concepts)
2 = Intermediate (foundational knowledge, able to apply the knowledge appropriately)
3 = Advanced (thorough knowledge, able to apply the knowledge appropriately)
4 = Expert (sophisticated theoretical and applied knowledge, understands complexities in application)
History, Theory, Models (Self Rating / Competency Level Required / Learning Goal)
• History of evaluation
• Evaluation paradigms (e.g., positivism, constructivism, collaborative interpretation, hermeneutics)
• Human construction of meaning
• Utilization-focused evaluation
• Empowerment evaluation
• Participatory evaluation
• Goal-free evaluation
• Realistic evaluation
Research Design (Self Rating / Competency Level Required / Learning Goal)
• Experimental, quasi-experimental, non-experimental
• Longitudinal
• Case study
• Ethnography
• Naturalistic inquiry
• Phenomenology and epistemology
• Program review
• Survey research
• Mixed method
• Ruling out alternative interpretations
Sampling and Measurement (Self Rating / Competency Level Required / Learning Goal)
• Probability sampling
• Purposeful sampling
• Reliability
• Validity
• Psychometric theory, including factor analysis
Capacity Building (Self Rating / Competency Level Required / Learning Goal)
• Adult education principles and techniques
SKILLS AND BEHAVIOURS
Ethical Conduct & Competence (Self Rating / Competency Level Required / Learning Goal)
• Respect the human dignity and worth of the people involved
• Sensitivity to cultural and social environment of the program and its stakeholders
• Ensure honesty and integrity of the evaluation
• Act in the best interest of the program stakeholders and the general public
• Disclose biases, conflicts of interest, and methodological limitations
• Self-assessment of competency to perform the evaluation (knowing one's own limits)
• Ongoing improvement of skills, knowledge, networks
Groundwork (Self Rating / Competency Level Required / Learning Goal)
• Become familiar with the program
• Analyse the social, political, and cultural context of the program
• Develop a program description
• Develop a logic model
• Determine if it is appropriate to evaluate the program
• Be clear who is the client
• Identify stakeholders
• Identify the goals and values of stakeholders
• Obtain cooperation of stakeholder groups
• Identify program objectives
• Identify information needs
• Specify evaluation questions
Evaluation Planning (Self Rating / Competency Level Required / Learning Goal)
• Selecting appropriate data collection and analysis methods
• Adapt the evaluation to situational needs/constraints
• Attend to cross-cultural, age, or gender issues
• Design the evaluation so as to minimize intrusiveness
• Incorporate triangulation, multiple methods, multiple perspectives, and multiple lines of evidence
• Incorporate consultation and stakeholder involvement as appropriate
• Select appropriate sampling methods
• Adapt/change the study as needed
Data Collection (Self Rating / Competency Level Required / Learning Goal)
• Literature review
• Program records, documents
• Development of performance measurement systems
• Questionnaires
• Interviews
• Focus groups
• Observation
• Participant observation
• Group concept development, brainstorming
• Town hall meetings and other group processes
• Experiential methods (games, classroom activities)
• Projective techniques, psychological tests
• Narrative inquiry, logs, journals, oral histories
• Using physical evidence
• Unobtrusive evidence
Data Analysis (Self Rating / Competency Level Required / Learning Goal)
• Narrative review
• Content analysis, quantifying qualitative data
• Identifying and verifying emergent themes
• Grounded theory
• Flow diagrams
• Database construction and manipulation
• Handling missing data
• Descriptive statistics (frequencies, means)
• Multiple regression or analysis of variance
• Meta-analysis
• Trend analysis
• Structural equation modelling
• Cost-effectiveness analysis, case costing, etc.
• Development of regular analysis and reporting systems for performance measures
• Grading
• Ranking
• Setting criteria
• Making judgements
Critical Thinking (Self Rating / Competency Level Required / Learning Goal)
• Analysis
• Synthesis
• Problem-solving
• Conceptual thinking
• Be open to unintended impacts and effects
• Remain neutral
• Be willing to question the system
• Be curious, inquisitive
• Think outside the box
• Draw conclusions
• Make recommendations
Reporting (Self Rating / Competency Level Required / Learning Goal)
• Presentations
• Report writing
• Preparation of cabinet documents and presentations
• Graphical displays
• Media communications
• Presenting negative/lukewarm evaluation results constructively
• Regular and timely communications
• Developing a communication strategy
Communication and Interpersonal (Self Rating / Competency Level Required / Learning Goal)
• Facilitation
• Negotiation
• Diplomacy
• Group processing
• Collaboration, be a team player
• Motivating others
• Conflict resolution, dealing with antagonistic people
• Political astuteness and perceptiveness
• Able to work within a multicultural environment
• Questioning
• Active listening
• Sensitivity
• Probing, obtaining clarification
• Providing constructive feedback in a tactful way
• Able to communicate in both English and French
Project Management (Self Rating / Competency Level Required / Learning Goal)
• Be clear who is the client
• Fiscal responsibility, budgeting
• Scheduling, time management
• Risk management
• Assembling an evaluation team
• Making use of outside expertise
• Managing/supervising others
• Writing proposals
• Accessing needed resources (personnel, information, instruments, funding)
• Organizing resources, maximizing use of available resources, doing evaluation on a shoestring
• Writing formal agreements
• Computer skills
• Good documentation practices
• Systematically reviewing data, analyses, and reports for accuracy/quality
Appendix B: Sample Performance Appraisal Form

Name: ______________________  Date: ______________________
Position: ______________________  Completed by: ______________________
Level of Practice: ___ Junior Consultant/Researcher  ___ Senior Consultant  ___ Principal

Evaluation Practice:

(For each element, rate: Always Exceeds Expectations | Meets Expectations | Sometimes Meets Expectations | Does Not Meet Expectations | Does Not Apply | Comments/Notable Observations)

Ethics and Quality Assurance
Knowledge and application of ethical guidelines
Knowledge of freedom of information and protection of privacy
Awareness of the steps in conducting an evaluation
Respect the human dignity and worth of people involved
Sensitivity to cultural and social environment of the program and its stakeholders
Self-assessment of competency to perform the evaluation
Discloses biases, conflicts of interest and methodological limitations
Ongoing improvement of skills, knowledge, networks

Evaluation Skills
Evaluation Planning
Project Management
• Client management
• Time management
• Fiscal management
• Quality control
• Managing/supervising others
Data collection:
• Interviews
• Focus groups
• Document review
• Surveys
• Literature review
• Observation
• Experiential methods
• Narrative inquiry
Data analysis:
• Content analysis
• Grounded theory
• Flow diagrams
• Database construction and manipulation
• Statistical analysis
• Interpreting data
Critical thinking in the areas of analysis, synthesis, problem-solving
Reporting:
• Presentations
• Report writing
• Graphical displays
• General communication

Corporate Citizenship
Works effectively as a team member
Assumes personal responsibility for team's effectiveness
• Makes constructive suggestions
• Avoids placing blame
• Models behaviours that contribute to the welfare of the team
Communicates respectfully with colleagues. Provides feedback in a respectful manner.
Understands and applies the business elements of consulting

Personal Learning Development Plan

Career Goals:
(Columns: What do you want to learn or change? | What do you need to do? | By when? | Indicator that learning or change has occurred | Date objectives are met)

Date | Progress | Intervention
Date | Accomplishment of Goal

Performance Review Meeting Date: ______________________
Signature of employee: ______________________
Signature of manager: ______________________