
The Australian Public Service Commission has used its best endeavours to ensure the accuracy of this publication. The legislative framework and the other information covered by the guide may change from time to time, particularly where principles are based on court or AIRC cases. We will endeavour to notify agencies of significant changes through our website, when they come to our attention. The material does not constitute a comprehensive outline of the policy or legislation on every aspect of the APS Values and Code of Conduct, relevant to every situation.

For these reasons, we cannot guarantee the material is complete, correct, up to date, or relevant for your purposes. The guide should not be relied on as a substitute for detailed advice as a basis for making decisions. In any important matter you should seek professional advice relevant to your particular circumstances.

© Commonwealth of Australia 2005

ISBN 0 9757584 2 X

This work is copyright. Apart from any use as permitted under the Copyright Act 1968, no part may be reproduced by any process without permission. Requests and enquiries concerning reproduction and rights should be addressed to:

Commonwealth Copyright Administration
Attorney-General’s Department
Robert Garran Offices
National Circuit
Canberra ACT 2600
www.ag.gov.au/cca

Foreword

In 2004 the Australian Public Service was recognised as a world leader in public sector reform by the United Nations. These reforms in recent years have provided the flexibility and agility needed for a more responsive public service with a strong ethos of accountability and professionalism. A critical part of this is the quality, capability and capacity of our people.

In the times ahead the Service will face much change and many challenges. If we are to continue delivering a high standard of public service to achieve the results we want, we need to ensure our people are supported to develop the skills and knowledge they need.

Recent State of the Service and Management Advisory Committee reports highlighted the significant impact on the Service of the changing shape of Australia’s workforce. In the future we can expect our workforce to be more mobile, increasingly multigenerational and smaller. We will need to compete to attract the best and brightest, and work to retain them.

Learning and development will be an increasingly important aspect of our strategies to build the capability of our organisations. It represents a significant resource investment within agencies and across the Service, and we need to understand how well that investment is working to achieve the desired ends.

In 2002, in conjunction with the Australian National Audit Office, the Commission developed Building capability—a framework for managing learning and development in the APS, to assist agencies as they design and implement learning and development initiatives that meet their individual needs.

Evaluating learning and development—a framework for judging success completes the picture. It gives agencies a practical guide to assist them in developing effective approaches to evaluating their learning and development initiatives. It is supported by a series of user-friendly evaluation tools and additional resources, available from the Commission’s website.

I particularly wish to acknowledge the practitioners from a wide range of agencies who gave their time, energy and expertise to the development of this resource. To all who have contributed to this effort, my sincerest thanks.

Lynelle Briggs

Australian Public Service Commissioner

June 2005


Contents

Introduction

• Purpose

• Components of the guide

• Objectives

• Importance

• Perspectives

• The APS model for evaluating learning and development

• Critical factors for success

• Road map

Section 1: Decide and plan—key decisions in evaluating learning and development

Section 2: The learning and development evaluation maturity self-assessment

Section 3: Case study

Appendixes

1 List of web-based resources

2 References and resources

3 Glossary of terms

4 Abbreviations

Introduction

Purpose

The booklet Evaluating learning and development—a framework for judging success is one part of the APS learning and development evaluation guide, which has been developed to provide practical support and guidance for agencies evaluating their learning and development. It provides a framework for ‘getting started’:

• for making the key decisions about what and how to evaluate your agency’s learning and development

• for assessing your agency’s maturity or organisational capacity to evaluate

• for planning your agency’s overall evaluation strategy.

Working through the underlying key decisions set out in this framework will provide the basis for agencies to then choose appropriate evaluation processes and tools. The other part of the guide is a collection of practical resources (tools, templates, checklists etc.) which is available on the Commission website. Practitioners can either adopt these resources ‘as is’ or adapt them to suit their agency’s specific circumstances and practices.

The guide is based on the model for evaluating learning and development contained in the better practice guide, Building capability—a framework for managing learning and development in the APS, published by the Australian Public Service Commission and the Australian National Audit Office in 2003.

Who is this guide for?

The guide is primarily for human resource and learning and development practitioners. Practitioners can also use it when working with the senior executive and relevant line managers to inform the development of the agency’s overall evaluation strategy or an evaluation plan for a particular programme.

The framework can be used by practitioners at a number of levels:

• to work with and involve the senior executive to develop the agency’s overall evaluation strategy—to review and inform the whole-of-agency approach to developing capability

• to work with the relevant line managers to design an evaluation strategy for a broad programme of learning and development, e.g. leadership development across the agency

• to design the evaluation of a specific learning and development activity, e.g. a half-day seminar introducing a new business system.


Components of the guide

The APS learning and development evaluation guide comes in two parts:

Part 1: Evaluating learning and development—a framework for judging success (this document)

Part 2: The practitioners’ website at www.apsc.gov.au/learn

The companion practitioners’ website is a collection of practical resources and tools that can be adapted and customised for use in agencies to best meet each agency’s specific business requirements.

Objectives of evaluating learning and development

1. Assess if intended learning and development objectives have been met

2. Continuous improvement of learning and development

3. Assess whether resources are used wisely

4. Assess the value for money of the learning and development

Importance of evaluating learning and development

Effective evaluation is part of an accountable, professional and ethical public service. It is fundamental to good governance, good practice and good management.

Effective evaluation is an important and useful management tool to help facilitate and promote effective learning and development—vital for an increasingly interconnected, complex and contestable public sector environment.

Effective evaluation will ensure that an agency’s approach to people development aligns with its business goals and is good value for money.



Perspectives on evaluating learning and development

Evaluation of learning and development can reveal different levels of information, each with its own emphasis and contribution to overall evaluation results.

It is important to consider and clarify the overall purpose of the evaluation before starting to plan the evaluation strategy. The purpose can range from high-level strategic issues, such as assessing the impact on the agency's capability, to operational issues, such as improving the quality and efficiency of the delivery of specific programmes and improving participants' capabilities.


The APS model for evaluating learning and development

The model for evaluating learning and development, as set out in the better practice guide, Building capability—a framework for managing learning and development in the APS (published by the Australian Public Service Commission and the Australian National Audit Office in 2003), covers six elements of evaluation across three phases of the life cycle of a learning and development intervention.

The six elements are:

relevance

appropriateness

reaction

capability acquired

performance on the job

outcomes.

The three life-cycle phases are:

Line of sight phase (Before intervention)

Is the learning and development relevant and appropriate to the learning need, goals, context, culture, funding arrangements etc.?

Learning and performance phase (During intervention)

Is the learning and development well conducted and managed, and does it help learners gain and transfer the necessary capabilities?

Outcomes phase (After intervention)

Does the learning and development produce tangible and intangible results, and what impact do these have on individuals and the organisation?

[Figure: Model for evaluating learning and development, mapping the six evaluation elements across the before, during and after intervention phases.]


Source: Building capability—a framework for managing learning and development in the APS.

Critical factors for success

Agency commitment and engagement

Extract and promote information from evaluations that assists managers to decide on:

the nature and type of learning needed for their people and teams

where to invest time, money and energy

how to improve the workplace culture to better support transfer and application of skills

the areas where succession and contingency planning need to focus

priorities for action, such as to develop new programmes, improve/modify existing programmes, reallocate resources.

Get line management input and sign-off for clearly stated learning outcomes and designs. Directly involving line managers in the planning and design stage will build their interest and clearly show their accountabilities for ensuring the desired results.

Provide feedback in ways that engage and involve the relevant manager/executive.

Adequate investment of time, money and resources

Make prudent decisions on evaluation an integral part of learning and development work plans. Be selective in what you choose to evaluate.

Allocate resources wisely to provide real results in areas that matter to the organisation.

Allocate the right amount of funds—proportionate to the end result desired.

Work with service providers to ensure that evaluation processes are built into the design of learning and development. Help them to help you.

Use existing data sources and existing data gathering methods to lower costs. Add value and use data in more than one way to decrease repetitive bureaucratic processes.


Sustainable practice

Embed learning and development evaluation processes to make them standard business processes within the organisation.

Make important evaluation information a standard feature in executive and management reports.

Ensure learning and development evaluation principles, strategies and practices are integrated within the agency’s HR practices—build a consistent approach to people management and avoid duplication.

Effective data, record and information management systems

Record and track minimum data sets such as expenditure and activity related data*.

Build in data entry tasks as an integral component of any learning and development activity.

Design data requirements and collection methods during the design phase of learning and development.

Match data collection methods to the purpose of the evaluation.

Use multiple sources of data to provide a holistic picture.

Integrate and use human resource management information systems, learning management systems and financial management systems to support evaluation.

* Refer to the better practice guide: Australian Public Service Commission and the Australian National Audit Office 2003, Building capability—a framework for managing learning and development in the APS, for details on the recommended minimum data set.

Road map

The framework consists of three sections:

Section 1:

Decide and plan—key decisions in evaluating learning and development

Provides guidance in thinking about, making decisions about and planning for the evaluation of learning and development.

Section 2:

The learning and development evaluation maturity self-assessment

Provides guidance on how organisational capacity to evaluate can be measured and incrementally developed.

Section 3:

Case study

Provides a structured process for the practitioner to anticipate, think about and prepare for typical challenges and issues that may arise in evaluating learning and development.

Together, the three sections are designed to help agencies to consider the relevant issues, make decisions and plan the evaluation of their learning and development.

Once the focus of the evaluation is clarified through this planning process, practitioners can then select relevant tools and resources from the companion practitioners’ website to assist them in implementing the evaluation. Resources are available to support evaluation of each of the six elements: relevance, appropriateness, reaction, capability acquired, performance on the job, and outcomes.

www.apsc.gov.au/learn


Section 1

Decide and plan—key decisions in evaluating learning and development


The key decisions are:

• Why is evaluation required?

• What will be evaluated?

• How will information obtained be used?

• Who will have an interest in the information?

• What data will need to be collected?

• How will this data be collected?

• When will this data be collected?

• How will information be reported?


Why is evaluation required?

To:

Determine success

Assess if objectives have been achieved

Make improvements

Ensure quality

Ensure accountability

Meet external requirements

Account for activity

Assess value or merit

Assess risk

Justify investments

Facilitate decisions whether to continue/discontinue the activity

Ensure appropriateness and alignment

Identify strengths and weaknesses


Planning focus: Intent of the evaluation

Consider, with examples:

• Importance of the intervention: activities which are important to key business goals or are key to meeting critical skill needs

• Profile of the intervention: high-profile activities which are sponsored by key influential players in the organisation who may have particular or strong interests in the activity and the outcomes

• Relative cost of the intervention: expensive, resource-heavy or time-consuming activities

• Risk profile: programmes which have novel or risky features or which have OH&S implications

• Relative age of the intervention: new and untested programmes and activities

• Certified programmes and courses: competencies which require assessments and tests for certification or accreditation

• Mandated: activities which are deemed compulsory by the organisation

Tips

• Be clear on the primary purpose of the evaluation

• Make sure the purpose is understood, shared and signed off by stakeholders


What will be evaluated?

• Relevance

• Appropriateness

• Reaction

• Capability acquired

• Performance

• Outcomes

Tips

• Not all areas need to be evaluated

• Use The learning and development evaluation maturity self-assessment (see Section 2) to gauge your organisation’s capacity to evaluate the intended areas

Planning focus: Scope of the evaluation

Agency A example: This could be a policy agency with established, highly experienced staff but significant change occurring in their relevant industry/sector.

Agency B example: This could be a fast-growing agency with a significant number of new high-profile responsibilities where service delivery must comply with complex legislation, with programmes based on a national qualification framework.

Percentage of activities and programmes to be evaluated at each evaluation level:

Evaluation level          Agency A   Agency B
Relevance                 90%        100%
Appropriateness           90%        100%
Reaction                  70%        50%
Capability acquired       20%        100%
Performance on the job    20%        80%
Outcomes                  10%        10%


How will information obtained be used?

To:

Improve the learning process

Improve relevant business areas

Improve decision making

Improve investment decisions

Improve organisational culture

Engage with stakeholders

Meet internal and external reporting requirements

Manage risk

Market activities

Tips

• Use the results proactively to inform and assist managers and individuals to better use learning and development.

• Use the information to educate the organisation on the factors which help or hinder learning and development.

Planning focus: Outputs of the evaluation

Who will have an interest in the information?

The executive management group

Senior executives

Line managers and supervisors

Learners

Facilitators/coaches

Programme managers

HR managers and learning and development practitioners


Tips

• Focus on groups that are relevant and who will or can have a positive influence

• Don’t wait to be asked for the information

• Use the information in business cases and for business planning

Planning focus: Target audience for evaluation findings

What data will need to be collected?

Relevance assessments

Appropriateness assessments

Investment and expenditure

Participant and facilitator reactions

Capability acquired

Performance on the job

Business outcomes

Work performance data

Activity-related data


Tips

• Data requirements are driven by evaluation purpose

• Create data gathering processes which easily become routine and simple

• Use alternative and existing data sources such as staff surveys, performance management data etc. where possible

• Design data gathering tools that can provide information for more than one purpose

Planning focus: Data requirements

How will this data be collected?

From:

Relevant planning documents

Reaction sheets

Assessments and tests

Participant and supervisor surveys

Success and failure case studies

Individual learning action plans

Individual impact maps

Performance management processes

Interviews

Structured workplace observations

Work outputs

Workforce performance data

HRMIS or LMS


Tips

• Record and track data diligently

• Use the human resource management information system (HRMIS) or learning management system (LMS) for data recording and tracking where possible

• Use simple data collection methods

• Use appropriate sampling techniques

• Collect data from multiple sources

• Uphold privacy requirements

Planning focus: Data sources and collection methods

When will this data be collected?

During the different phases of the whole cycle:

Line of sight phase

Learning and performance phase

Outcomes phase


Tips

• Collect data in a timely manner

• Build evaluation into learning and development design

Planning focus: Timeframes

How will information be reported?

As:

'Dashboards' and scorecards

Activity measures

Performance measures

Cost benefit analysis

Return on investments

Impact assessments

Qualitative reports


Tips

• Report in a timely manner

• Report in a constructive manner

• Speak the language of the audience

Planning focus: Report format

Type of evaluative data and how to use the information:

• Relevance data: demonstrate alignment of learning and development strategy with organisational goals and needs.

• Appropriateness data: position learning and development as a lever for change; highlight needs which should be met by other (non-learning and development) means.

• Reaction data: influence service providers for better outcomes; market the programme and encourage employee interest; improve delivery and administration of the programme.

• Capability acquired data: account for and demonstrate bench strength for succession planning, contingency planning and workforce planning purposes; encourage (planned) employee mobility.

• Performance on-the-job data: help managers and supervisors better support their staff in applying learning and development; obtain and build ‘buy in’ and support from line managers and supervisors; highlight structural and process aids and barriers to on-the-job performance; inform learning and development procurement processes.

• Outcomes data: demonstrate the value of learning and development to the organisation; assess linkages between positive outcomes and employee commitment; enhance organisational branding as employer of choice; inform learning and development procurement processes.

Section 2

The learning and development evaluation maturity self-assessment

The maturity self-assessment is a guide for organisations to develop their maturity in the practice of good learning and development evaluation.

It covers seven elements, six of which are based on the elements of the APS model for evaluating learning and development. The additional element relates to evaluation strategy and planning, which is an essential component for any mature evaluation capability.

Indicators

Each element is defined by several descriptors and illustrated by examples of observable outputs that can act as indicators of successful practice. These indicators are designed to enable concrete application of each element. Indicators can be viewed as desirable better practices.

The maturity self-assessment

The five-level maturity self-assessment is mapped against these indicators and is designed to help gauge, characterise and articulate the maturity of your agency against each indicator.

The maturity self-assessment can also be used to set incremental targets allowing each indicator to be applied with greater sophistication as may be required. The five maturity levels are grouped into three categories (indicated by different colours on the following pages) for more refined maturity assessments and the tracking of incremental improvement.


For each relevant indicator, cross the box that best describes the state of play in your agency. Then tick the box that best describes the desired state of play in your agency.

1. Evaluation decisions and planning

Element: Decide what to evaluate, the level to which evaluation is to occur and plan for successful evaluation.

This entails (descriptors):

• Developing an evaluation strategy and plan

• Deciding what activities to evaluate based on accepted criteria

• Deciding what phases to evaluate based on evaluation objectives

• Support by the agency of evaluation activities

• Deciding how to use evaluation data effectively

Indicators: what successful practice of this element would look like

Tracking the maturity of learning and development evaluation in your organisation (the APS learning and development evaluation maturity model):

Level 1 (Ad hoc): Practice is applied poorly or inconsistently and has an uncertain level of acceptance.

Level 2 (Managed): Practice is performed and managed with some skill for compliance reasons.

Level 3 (Defined): Practice is defined, familiar, shared and skilfully performed.

Level 4 (Integrated): Practice is embedded and seen as part of daily work and as adding real value.

Level 5 (Optimised): Practice is continuously improved and adapted for agency outcomes.

This five-level scale applies to the indicators for each of the seven elements that follow.

1. Evaluation activities are guided by an agency evaluation strategy.

2. The evaluation strategy describes the intended outputs and purpose of evaluation activity.

3. Decisions on which programmes to evaluate are based on accepted criteria.

4. Decisions on what to evaluate are based on the objectives of the evaluation.

5. Evaluation and data collection decisions for programmes are made at early stages of programme design.

6. Evaluation activities are supported by the agency leaders and managers.

7. Data gathering and information management systems are in place and used to support evaluations.

8. Evaluation data is used actively within the agency to improve programmes, decisions, processes and people development strategies.


2. Relevance

Element: Assess how well proposed learning and development interventions address business needs, capability needs and individual needs within the agency.

This entails (descriptors):

• Clarifying the purpose of any learning activity

• Identifying linkages to business goals

• Ensuring learning meets identified capability needs

• Situating learning within an organisational capability framework

• Assessing the potential of intended outcomes to contribute positively to organisational goals or needs

Indicators: what successful practice of this element would look like


1. Agency capability requirements at organisational, group and individual levels are understood and used as the basis for learning.

2. Learning objectives have line of sight to workforce planning needs and business goals.

3. Learning activities are approved by the agency.

4. Intended outcomes from learning have reasonable potential to contribute to improved performance at individual and/or group and/or organisational level.

5. Business cases are built for learning activities where appropriate.


3. Appropriateness

Element: Measure the degree to which the allocation of resources to learning and development is appropriate to identified needs and priorities.

This entails (descriptors):

• Identifying the extent of integration of learning and development with other HR strategies and business practices

• Describing the desired benefits

• Describing the scope of each intervention

• Obtaining quantitative and qualitative information about the level and nature of investment

• Assessing if the design of the intervention matches the desired culture and the needs of the target audience

• Identifying risks

• Identifying alternatives

Indicators: what successful practice of this element would look like


1. Learning is used to meet needs which can be met appropriately through development.

2. Alternative strategies are considered in meeting needs.

3. Learning activities are appropriate to the needs they are designed to address and to the audience.

4. Learning is linked to business planning, workforce planning, performance management and career development processes.

5. Learning activities are appropriate to actual or desired organisational culture and context.

6. Allocation of resources is proportionate to the need that the learning is designed to address.

7. Adequate data sets on costs and activity are established and maintained.

8. Cost benefit assessments or value for money assessments are conducted where necessary.


4. Reaction

Element: Measure participants’ and facilitators’ immediate reactions to aspects of the intervention or learning and development activity.

This entails (descriptors):

• Measuring participant reactions

• Measuring facilitator and programme manager reactions

• Using the data to make improvements

Indicators: what successful practice of this element would look like


1. Participant reactions are obtained when appropriate.

2. Facilitator and programme manager reactions are obtained when appropriate.

3. Data is used appropriately to improve activity-related aspects in a timely manner.

4. Data is reported to relevant stakeholders including supervisors and activity sponsors.

5. Data is used appropriately to generate organisational interest.

6. Data is used in time series comparisons where appropriate.


5. Capability acquired

Element: Evaluate the success of the learning and development activities by measuring whether the individual(s) and/or the agency have acquired the capability, knowledge, attitudes or competency required.

This entails (descriptors):

• Acquisition of skills, knowledge and attitudes occurs in a logical and sequential manner

• Measuring if performance standards have been achieved through valid and appropriate tests or assessments

Indicators: what successful practice of this element would look like


1. Standards for desired skill performance are identified in learning activities where appropriate.

2. Standards are based on national frameworks where required or appropriate.

3. Appropriate assessments or tests are used to measure the acquisition of skills, knowledge or attitudes.

4. Opportunities for re-tests are built into learning activities where appropriate.

5. Assessments appropriately reflect the context in which the skills, knowledge or attitude is to be performed.


6. Performance on the job

Element: Assess individual performance on the job following learning and development activities.

This entails (descriptors):

• Identifying what learning has been transferred and applied in the workplace

• Identifying how transfer and application of learning has occurred (or why transfer and application of learning did not occur)

• Strengthening transfer and application of learning processes and mechanisms

Indicators: what successful practice of this element would look like


1. Transfer and application of learning mechanisms are devised at early stages of programme design.

2. Reasons for successful or unsuccessful transfer and application are understood by all stakeholders.

3. Transfer and application of learning is measured where appropriate and at appropriate times.

4. Contribution of learning (and other variables) to overall performance is identified.

5. Data is used to improve transfer and application of learning processes and mechanisms in a timely manner.

6. Organisational variables which impact on transfer and application are identified and managed.


7. Outcomes

Element: Assess outcomes achieved at individual, group and/or organisational levels. They can be positive, negative, or at times, ambiguous.

This entails (descriptors):

• Measuring value for money

• Identifying accrued tangible and intangible results which lead to better business outcomes

• Assessing how the same or better outcomes could be achieved in a cheaper and/or quicker manner

• Identifying areas for improvement within the organisation and its culture

Indicators: what successful practice of this element would look like


1. Outcomes or results of transfer and application of learning (or lack of) are identified and quantified in ways that are meaningful to the organisation.

2. Factors other than learning which contribute to outcomes are recognised and identified.

3. Outcomes or results are analysed and framed in relation to business goals.

4. Outcomes or results are analysed for their value or benefit in relation to cost, or as returns on investment where appropriate.

5. Outcomes or results are reported and communicated to stakeholders.

6. Data is used to improve or guide future decision making.

7. Data is used to shape learning strategies.

8. Data is used to shape organisational context and culture to better support learning.


Section 3

Evaluating learning and development—a case study

This four-part case study is intended to help the practitioner anticipate, think about and prepare for typical challenges and issues that may arise in evaluating learning and development.

Use this case study:

to reflect on the issues raised and to think of how you would respond to the key questions

to generate discussion with your team or manager on how similar challenges and tasks in your organisation could be tackled

to identify, anticipate, prepare and plan for potential issues and activities that may arise during learning and development evaluation

to inform your stakeholders of evaluation-related issues so that they can better support your evaluation effort

to help new human resource and learning and development practitioners understand and plan for some of the complexities involved with learning and development evaluation in the APS.


Part 1: The big picture

Rebecca is an Executive Level 2 working in the HR branch of a medium-sized Australian Public Service agency which has policy and programme responsibilities. She is currently Director People Development and was promoted into the job after a three-year stint as a line manager in one of the agency’s more prominent programme divisions.

As line manager, Rebecca made it a point to actively support her three teams in their professional development and paid much attention to building their technical capabilities as well as their skills in working well within and across teams. A teacher by background, she has always valued learning.

Part of her vision for her new role is to ensure that operational areas within the agency are well supported with a variety of business-related learning and development opportunities. These would provide practical help to people working with the many complexities of policy development and programme implementation. Learning and development in the agency, according to Rebecca, is supposed to be about practical skill development in areas relevant to the agency’s core business.

While the branch has lead responsibility for many HR-related functions, processes and policies, the branch also works in close collaboration with two HR support cells that service two of the largest divisions in the agency.

These cells essentially provide operational support to divisional staff including personnel administration, recruitment, case management and co-ordinating learningand development activities in highly specialised technical areas.

The branch has a good reputation within the agency and most divisional heads have been appreciative and supportive of branch initiatives. Of late however, the branch has been under increasing pressure to get the agency’s HRMIS in order so that it provides useful workforce performance reports for the executive, and to streamline its menu of learning and development activities.

Some division heads have also mentioned, on more than one occasion, the need for the branch to demonstrate the returns that the agency’s expensive middle-management programme is making on its investment. This programme is paid for by all the divisions based on their quota of participants. The larger divisions have felt the burden of having to invest more, given the higher proportion of their participants.

This pressure, together with the recent focus on learning and development evaluation by the Government and central APS agencies, has compelled the branch head to think about the need for an agency learning and development evaluation strategy. To ignore these pressures would be to risk divisional support and goodwill and the relevance of the learning and development function. Divisions would simply develop their own learning and development agendas if they felt their needs weren’t being met by corporate HR.


As Director People Development, Rebecca has primary responsibility for the design, development and implementation of this strategy. She knows that this is a daunting task. The agency runs over 30 learning and development activities including:

competency certification courses

postgraduate programmes in policy development

‘study bank’—agency-approved time for formal study

supervisor development programmes

IT training

E-learning for OH&S and diversity issues

a middle management residential programme

an action learning set in a policy area

study visits of community stakeholder groups

self-paced study manuals

external attachments

occasional lunchtime seminars

basic APS procedural and administrative development programmes

executive coaching services for the senior executives

brokering external programmes.

Key questions to consider:

1. What can be done to begin developing the strategy?

2. What will be the key objectives of this strategy?

3. What criteria can be used to determine which programmes will be evaluated?

4. What criteria can be used to help identify the level of evaluation that would be required?


Part 2: Issues around evaluating relevance

Rebecca does realise that any good learning and development evaluation begins with an assessment of the programme’s relevance. After all, it’s stated in the APS evaluation model! But she is not sure of how to actually test each programme’s relevance, especially now that processes like skills audits and needs analyses seem to have fallen out of favour with many learning and development practitioners and employees.

The branch did recently attempt to collate training needs data from its performance management processes, but as this is still paper-based and confidential, obtaining this data has been tedious with patchy results.

The need for learning therefore is very much shaped by a variety of interests and agendas. This has led to a purely reactive approach on the part of the branch and an unsatisfactory sense that the agency’s learning and development strategy is a patchwork of unintegrated events.

Key questions to consider:

1. How can relevance of a programme be tested for existing programmes and how can it be guaranteed for new programmes?

2. How can relevance of learning and development to organisational goals/objectives/outcomes be achieved?

Issues around evaluating appropriateness

The agency has been brave in being an early adopter of novel learning and development initiatives. For example, it was one of the first agencies in the APS to use an e-learning system for its OH&S training and to employ executive coaching for its senior executives.

On the face of it, the agency has a rich menu of formal and work-based learning activities. Rebecca is, however, uncomfortable with the ad hoc nature of this ‘menu’, as are some division heads. They have started to question the appropriateness of some learning and development activities which they have to fund.

As well, increasing budgetary constraints, time and workload pressures throughout the agency mean that HR is increasingly under pressure to justify each learning and development activity it proposes.

Key questions to consider:

1. How can appropriateness of a programme be evaluated?

2. How can its appropriateness, once determined, be communicated?


Part 3: Issues around evaluating learners’ reactions

Reaction-level evaluation of learning and development is carried out within the agency. Reaction sheets are religiously handed out at the end of every activity and the HR branch has used these to make judgements on content, service providers and administrative arrangements. Findings are generally discussed within the branch and are often used to drum up participant interest within the agency.

Key questions to consider:

1. What is the purpose of the reaction sheets?

2. How can you use this reaction data in a more strategic or creative way?

Issues around evaluating capability acquired

The agency corporately funds several competency-based programmes which result in certifications at the certificate IV and diploma levels. Successful participants are generally considered technical experts within the agency. Participants seem to value the certification they receive at the end of their learning and development assessment.

Rebecca has felt for some time now that many other programmes would benefit from having some form of assessment to help ensure participants actually master the skills being taught. She is aware of people in the agency, for example, who can’t complete tasks even after attending a course. She is not quite sure how to go about doing this and is unsure of the reaction of employees to the idea of assessments in non-certified programmes.

Key questions to consider:

1. What activities are best suited for capability assessments?

2. What assessments can be introduced to ensure that skills are actually gained from these activities?


Issues around evaluating performance on the job

There is little that the agency does formally at the moment to gauge whether employees actually apply what they’ve learned from learning and development activities. Application on the job has always been seen by HR as being the responsibility of line managers, except where HR has helped implement work-based activities, such as action learning sets and coaching, which seemed to have obvious and visible work-based applications. Some information on application of learning, Rebecca assumed, would be captured in the agency’s performance management processes. However, as this is still paper-based and confidential to the individual, such information would be hard to obtain.

Rebecca does know anecdotally that many participants do seem enthusiastic after taking part in learning and development activities. But she also realises that this enthusiasm and energy dissipates quickly once participants return to their workplaces, and is keen to learn why this is so and how this affects the transfer of learning.

Key questions to consider:

1. How can your agency find out if learning has been applied on the job?

2. What would some barriers to application be?

3. What would encourage application of learning?

4. What can the branch do to more actively encourage the application or transfer of learning?


Part 4: Issues around evaluating outcomes

At the moment, there is very little that the agency does to gauge formally the difference learning has made to its outputs and outcomes.

There is a commonly held view that the learning is generally valuable and beneficial to individuals and the organisation.

There has, however, been no formal effort to actually describe and report what these benefits might be and how valuable they actually are. Recent reporting has been limited to learning and development activity reports, and Rebecca is not even confident that these are accurate as the HRMIS does not yet capture such data well.

Current reported data is often compiled from doing the ‘ring around’ with branches to find out how many of their people participated in formal learning and development activities over the previous quarter. The branch did recently create an Excel database for each division to record, cost and report their learning and development activities, but this is used inconsistently.

The agency’s last performance management cycle does seem to report an increase in performance ratings, especially for the executive-level feeder groups. Rebecca and her colleagues are unsure if this is a result of participation in learning and development activities.

Several division heads have also queried the cost of the middle management programme and say that similar programmes conducted elsewhere are less expensive. While there is a general consensus that this programme is well conducted and received, and that the external service provider is an excellent partner, some senior executives do think that the agency is paying more than it’s worth. As a former participant on the programme, Rebecca does not agree with this view and would like to demonstrate that the programme is making an impact on individuals and their commitment to the agency. She does feel strongly that the investment made by divisions is worthwhile.

However, she is unsure how to go about demonstrating this with meaningful data. Concepts such as value for money, returns on investment and impact statements all seem useful, but she is unsure which would be most appropriate.

Key questions to consider:

1. How would you show that the programme is worthwhile?

2. What data would be useful to report to senior management? How can you start to get hold of this data?

3. Which evaluation approaches would be useful in helping demonstrate the programme’s impact?


Appendix 1

List of web-based resources

These and other tools and resources are available on the Commission’s website at: www.apsc.gov.au/learn

Evaluation elements and examples of tools/resources available:

Decide and plan

• Prioritisation matrix
• Evaluation plan template
• Tips on engaging stakeholders
• Types of learning and development failures
• Quality checklist
• References and resources for evaluation planning

Line of sight phase (relevance and appropriateness)

• Planning guide for evaluating relevance
• Relevance index
• Planning guide for evaluating appropriateness
• Appropriateness index
• Output potential index
• Cost calculator
• Tips when evaluating relevance and appropriateness
• Relevance and appropriateness scorecard
• References and resources for evaluating relevance and appropriateness

Learning and performance phase (reaction, capability and performance)

• Participant reaction to learning and potential for application
• Participant reactions
• Sample participant reaction evaluation questionnaire for a complex programme
• Techniques to evaluate learning and performance
• Accountable learning agreement
• Planning guide for evaluating learning and performance
• Individual learning impact map
• Learning survey (for use on desktop)
• Learning survey (for use as hardcopy)
• Structured observation log
• Reporting on learning and performance evaluation
• Tips when evaluating learning and performance
• References and resources for evaluating learning and performance

Outcomes phase

• Planning guide for evaluating impact
• Success case method
• Lessons learned method
• Returns on investment
• Impact statement
• Value for money assessments
• Tips when evaluating impact
• References for impact evaluation


Appendix 2

References and resources

Australian Public Service Commission and the Australian National Audit Office 2003, Building capability—a framework for managing learning and development in the APS, Australian Public Service Commission and the Australian National Audit Office, Canberra. www.apsc.gov.au

Australian National Audit Office 2002, Managing people for business outcomes, Audit Report No. 61, ANAO, Canberra. www.anao.gov.au

Australian National Audit Office 2003, Managing people for business outcomes, year two, Audit Report No. 50, ANAO, Canberra. www.anao.gov.au

ACT Chief Minister’s Department 2003, ACT public service learning and development framework, CMD, Canberra. www.psm.act.gov.au/publications/L&D_framework-final.pdf

Commonwealth Department of Health and Aged Care 2001, Evaluation: a guide for good practice, Commonwealth Department of Health and Aged Care, Canberra. www.health.gov.au/internet/wcms/publishing.nsf/Content/mentalhealth-resources-evaluation

Centrelink 2004, Centrelink evaluation handbook: a guide for planning and conducting evaluations in Centrelink, Centrelink, Canberra.

Finance and Public Administration References Committee 2003, Recruitment and training in the Australian Public Service, The Senate, Canberra. www.aph.gov.au/senate/committee/fapa_ctte/completed_inquiries/2002-04.htm

Management Advisory Committee 2003, Organisational renewal, MAC, Canberra. www.apsc.gov.au

State Services Commission 2001, A framework for measuring training and development in the state sector: working paper no. 12, State Services Commission, New Zealand. www.ssc.govt.nz


Appendix 3

Glossary of terms

Alignment: Vertical agreement of strategies, structures, capabilities, technology and processes with corporate goals, cascading to lower level plans and activities. These components are interdependent and must be viewed in relation to one another to ensure they are congruent with the overall corporate directions and desired culture.

Coaching: The practice of instructing, demonstrating, directing, prompting and encouraging participants. Generally concerned with methods rather than concepts.

Evaluation: A systematic, objective assessment of the appropriateness, efficiency and effectiveness of a programme or part of a programme. The process of gathering information in order to make good decisions. It is broader than testing, and includes both subjective (opinion) input and objective (fact) input. Evaluation can take many forms including value for money assessment, portfolio assessment, 360° feedback and self-reflection.

Governance: Encompasses how an organisation is managed, its corporate and other structures, its culture, its policies and strategies, and lines of accountability.

Human resource/people management: A series of organisationally approved initiatives designed to facilitate the effective management of people to achieve agency outputs and outcomes. This includes specific practice areas such as organisational development, workforce planning, recruitment and selection, performance management, learning and development, reward and recognition, workplace diversity, employee relations and occupational health and safety.

Integrate: To make strategies and processes horizontally compatible—share common practices and data sets etc. to achieve efficiencies, avoid duplication and maintain a congruent image and seamless process for users.

Learning and development: All processes associated with the identification of agency and individual requirements in relation to capability development, and the design, delivery and/or brokering of opportunities to develop the capability of individuals and groups within the agency.


Learning and development activity: Any activity specifically designed to build the capability of individuals. This includes a wide range of on-the-job and off-the-job processes, such as structured programmes, formal education, networks and forums, job rotation, coaching and mentoring. A list of such interventions is found in Appendix 2 of Building capability—a framework for managing learning and development in the APS, 2003, published by the APS Commission and the ANAO.

Mentoring: Three mentoring roles can exist in a work context:

mainstream mentor—someone who acts as a guide, adviser and counsellor at various stages in someone’s career

professional qualification mentor—someone required by a professional association to be appointed to guide a student through a programme of study, leading to a professional qualification

vocational qualification mentor—someone appointed to guide a candidate through a programme of development and the accumulation of evidence to prove competence to a standard.

The mentor’s role includes:

acting as a sounding board

sharing benefit of experience and perspective

highlighting opportunities

providing opportunities where the individual could showcase their talent.

Performance indicators: Information that can be used as the basis for determining the outcome, or impact, of particular learning and development activities or programmes—generally agreed with stakeholders ahead of time.

Workforce planning: A continuous process of shaping the workforce to ensure that it is capable of delivering organisational objectives now and in the future. The desired outcomes of workforce planning are its effective integration into an agency’s strategic planning framework and the alignment of HR strategies to continuously deliver the ‘right people in the right place at the right time’.


Appendix 4

Abbreviations

ANAO Australian National Audit Office

APS Australian Public Service

HR Human resources

HRMIS Human resource management information system

LMS Learning management system

OH&S Occupational health and safety


