
Using NRCan's Program Activity Architecture (PAA) as the Framework for Evaluation Planning

Presentation to CES-NCC
Gerry Godsoe and Gavin Lemieux
Strategic Evaluation Division
Science Policy Integration
June 2, 2008


Purpose

1. To offer a perspective on evaluation planning.

2. To provide some information on NRCan’s multi-year evaluation planning, based on the PAA.

3. To seek your comments, experiences and suggestions.


NRCan Overview

A medium-size department that integrates science and policy. Minister: Gary Lunn; DM: Cassie Doyle. 4,600 FTEs; $1.8B budget; 20-25 G&C programs. Formed in 1993 and includes:

Energy (linked to EC and climate change); Earth Sciences (e.g., Geological Survey of Canada; Canada Centre for Remote Sensing; Mapping); Canadian Forest Service (research and competitiveness); Minerals and Metals (close relations with industry).

NRCan brings science-based natural resources policy to the challenges of economic, social and environmental development.


Background on Evaluation at NRCan

The Strategic Evaluation Division is a team-based organization with 15 FTEs supplemented by contractors on Standing Offer.

The Division helps develop and approve the performance measurement and evaluation strategies in RMAFs/TB Submissions.

Evaluation has just moved to Science Policy and Integration. The Departmental Management Committee, chaired by the DM, will serve as the Evaluation Committee.

Now let’s look back, way, way back…


Program Activity Architecture

The PAA is TB’s latest attempt to have departments link objectives, programs, resources and results in a comprehensible picture.

At its most basic level, the PAA is a means for departments to describe what they do to TB analysts, Ministers, Cabinet, and Canadian society.

The significance and difficulty of this task are often overlooked.


NRCan’s PAA and Evaluation Planning

In response to the emphasis on expanding evaluation coverage, and in anticipation of the new TB Evaluation Policy, NRCan developed a five-year evaluation plan starting in 2007-08 based on three principles:

Use the PAA as a guide (at Sub-Sub or Sub-Activity levels); cover off G&C programs within their boxes of the PAA; and limit the number of evaluations.

In other words, fewer, larger and more strategic evaluations.

At present, we are in the process of reconciling new pressures, such as the anticipation of Strategic Review, with a new Evaluation Plan.


The issues and challenges so far…

Use of the PAA for evaluation planning is still in its early days at NRCan, but already a number of issues have arisen:

Evolution of the PAA; consistency within the PAA; size and complexity of evaluations; financial information; nature of reporting; the temporal element.


Evolution of the PAA

Experience: The 2007-08 PAA closely mirrored the organizational chart with sectors. G&C programs and other activities come and go, change names, etc., confounding tracking and evaluation coverage. The 2008-09 PAA moves away from the “sector silos” to a more horizontal approach, with some evolution towards a project-based PAA structure.

Lesson Learned: Evaluation planning will continue to be a challenge. 100% evaluation coverage = “best effort at that point in time”; NRCan will be unable to evaluate discontinued programs; TBS will need to be flexible.


Consistency within the PAA

Experience: The tendency is to attribute common characteristics to activities within the same PAA unit that may NOT exist (e.g., history, world view, vision, logic, objectives, governance, management framework, performance indicators, financial and results reporting).

The PAA reflects the “best fit” and “lowest common denominator” principles as well as evolutionary realities.

Lesson Learned: Don’t assume that the current situation is based on logic. It may simply reflect years of starts and stops, and responses to all kinds of pressures and resource constraints.


Size and complexity of evaluations

Experience: Past NRCan evaluations were usually done on G&C programs or components and were relatively small in scale. Good = in-depth coverage for program renewal; manageable projects. Bad = too detailed; focused on improvements; of interest to managers rather than ADMs/DM. Current evaluations cover tens of millions of dollars and several “programs” with multiple managers, logic models/RMAFs and delivery systems.

Lesson Learned: Our evaluation assessments are taking months just to understand the basics, talk to the managers, and start tracking down the documents, data and financial information. Evaluations are taking over a year, and this has implications for staff experience, fatigue and morale.


Financial information

Experience: NRCan does not yet have a common integrated financial and results reporting system. Past financial data and records are based on responsibility centres and organizational relationships, tempered by risk management and central agency reporting requirements.

The financial systems are always in a state of flux, attempting to catch up with past policy and program developments.

Lesson Learned: Available financial information may represent only order-of-magnitude figures.


Nature of reporting

Experience: We have been moving towards concise evaluation reports of 20-30 pages aimed at ADMs/DMs. Given the larger size and complexity of PAA units, we will only be able to provide 30,000-foot views, with very limited information on G&Cs, certainly for the first cycle of evaluation coverage.

Lessons Learned: The first five-year cycle will provide an overview of PAA units (e.g., profile; history; existing performance measurement data; tentative conclusions on relevance, success and cost-effectiveness), but it is unlikely to provide the definitive assessments desired by decision makers or the detailed cost-benefit analyses desired by TBS.


The temporal element

Experience: Evaluations look at past performance in an attempt to predict and influence the future. The past often represents a confusing picture, reflecting the increasingly rapid shifts in government policy (e.g., 2-3 year program funding cycles) and relatively scarce resources to deliver, manage and report on activities.

Lessons Learned: Evaluations cannot improve the past. All we can do is help explain the past to the best of our abilities, based on the available information and a neutral stance.


Conclusions

The PAA presents an opportunity to link evaluations with up-to-date strategic policy areas and with Parliamentary reporting.

The challenge is to balance evaluation coverage, depth and available resources.

We would welcome your experiences, reactions and thoughts.

