FINAL REPORT
LCC PERFORMANCE MEASURES FRAMEWORK DEVELOPMENT
Marni Koopman, Brian Petersen, and Jensen Montambault
8-30-2013
ABSTRACT
This report outlines the process used to develop Performance Measures for the LCC Network, as well as the outcome, which consists of a suite of measures. We have documented the process so that readers can better understand the discussions and information that went into the development of a performance measures framework. Input came from interviews with LCC Coordinators, review of LCC documents, review of relevant performance measures frameworks from natural resource and socioeconomic sectors, interviews with users and designers of relevant frameworks, and a 2-day framework design “charrette” with the LCC Performance Measures Working Group. Key features incorporated in the framework for the LCCs include flexibility (LCC units can change which measures they report on over time), choice (LCCs can choose which metrics to report on and how to measure them), collective reporting (LCCs are not forced to compete with one another), and meaningful measures (developed by the LCC Coordinators, Science Coordinators, and Partners). Based on extensive review of other frameworks, including specific attention to what worked and what did not, the LCC Working Group decided to base the measures on the LCC Network Mission Statement, which has 5 main bullets. Each LCC was given an opportunity to provide input and design measures that address the LCC mission statement and that they would choose to report on. Some measures were chosen by most of the LCCs that participated in the process, and these measures might be recommended as common measures for all LCCs. Others might be reported on by some LCCs but not others. This design allows individual LCCs to maintain autonomy while providing measures and metrics that, taken together, will provide a means to gauge the performance of the LCC Network. This report provides both a description of the process that brought us to this tracking structure and the LCC Performance Measures Framework itself, with the specific metrics that LCC Coordinators have suggested.

BACKGROUND
Creating effective performance measures (PM) poses many challenges. In sectors ranging from natural resource management to education and public health, PM have been implemented to ensure programs meet stated goals and objectives. In business, PM are implemented to increase productivity and profit. Tracking provides necessary insight into how well an organization or network is functioning and helps direct appropriate and effective use of limited funds. But identifying effective PM often remains an elusive goal throughout natural resource sectors – sought after but rarely found. The LCC Network has identified establishing PM as a primary need and goal (LCC Charter 2012).
In August 2012, our team of three researchers from outside the LCCs was awarded LCC funding to help the LCC Performance Measures Working Group develop performance measures for the LCC Network. This report summarizes our efforts to facilitate a process to eventually establish workable and meaningful PM. Based on interviews and careful review of LCC documents, we identified 13 criteria with which to assess potential PM frameworks. Using these criteria, we whittled 48 possibilities from throughout natural resource and socioeconomic sectors down to 5 top contenders. We then interviewed people who had applied or worked closely with those 5 frameworks to identify their strengths and weaknesses. We turned back to the LCC Performance Measures Working Group for guidance on how to proceed with developing measures for the LCC Network. After extensive discussion about the purpose of measures at the network level and the differing maturity of the individual LCC units, the Working Group decided to move forward with a suite of measures that can be used by OMB to track progress of the LCC Network, but with some flexibility available to individual units in choosing which metrics to report. This report describes that process and the resulting framework, and it includes a request for additional assistance from all of the LCCs in identifying which metrics to include in the final suite and in providing feedback on the proposed approach.

DEVELOPING PM FRAMEWORK CRITERIA
Interviews – To begin, we interviewed 21 LCC Coordinators to give them an opportunity to share their perceptions regarding PM. These interviews helped identify the characteristics that the Coordinators prioritized in PM, as well as potential pitfalls to avoid, and provided important insights into how they view performance measures. Near-universal support exists for creating and implementing meaningful performance measures. However, Coordinators expressed concerns and a diversity of opinions on the best way to make the measures meaningful. Primary challenges and concerns Coordinators identified included:
• Top-down performance measures could undermine the autonomy of the individual LCC partnerships.
• Different stages, funding levels and geographical, biophysical, and cultural differences of individual LCCs should be reflected in the context of any universal metrics.
• Measures should assess outcomes, not outputs.
Despite these challenges, Coordinators support working towards establishing meaningful measures. Opinions varied with respect to:
• Levels of skepticism
• Whether measures could include non-conservation outcomes, such as innovation
• Whether available resources were sufficient
• Existing requirements that often lead to irrelevant or duplicative measures
• Utility and validity of network-wide or common metrics
Document Review – We also reviewed the mission, goals, and objectives of LCCs that had documented them online. Our review resulted in a list of potential priorities for PM based on the number of LCCs with similar objectives and goals (Table 1).

Table 1. Categories of stated mission, goals, and objectives documented in online materials for individual LCCs and the LCC Network. These priorities helped shape our criteria for evaluating PM frameworks. They can also help guide the development of specific PM for individual LCCs and the network as a whole. In the last column, categories with 0 LCCs (such as consistency across LCCs) were mentioned only in LCC Network documents rather than at the unit level.

Mission, goals, or objectives categories   # Mentions   # LCCs that mention
Inform conservation efforts/resource management   108   20
Conservation of fish, wildlife, and plants   95   20
Provide science/identify science needs   82   18
Increase/promote collaboration and coordination   65   19
Work across broad scales   52   16
Provide tools/technical support   43   13
Climate change adaptation (natural systems)   40   15
Other stressors   29   14
Evaluating outcomes/effectiveness   21   8
Conservation of cultural resources   19   9
Adaptive management/Strategic Habitat Conservation   19   7
Communication   17   12
Inventory and monitoring   16   8
Consistency across LCCs   11   0
Conservation of ecosystem/ecosystem function   9   4
Networking across LCCs   6   2
Climate change adaptation (human systems)   5   3
TEK   2   1
Connectivity and movement   1   1
Best practices   1   0
Education   1   1
Policy recommendations   1   1
Interdisciplinary (natural systems and society)   1   1
Business processes   1   1

Criteria – From the LCC Coordinator interviews and PM framework analysis we compiled a list of 13 criteria essential to crafting appropriate PM for the LCC Network. We determined that the PM framework should:
1. Support and result in measures that are meaningful to LCCs
2. Inform federal funders without making LCCs federal
3. Inform federal funders without top-down mandates
4. Reflect autonomy and diversity among LCCs
5. Reflect partnership and collaboration
6. Help unify the LCC network rather than causing competition
7. Be iterative and flexible
8. Be sensitive to indigenous people as well as other partners
9. Focus on outcomes rather than tasks or processes
10. Accommodate differences among LCCs (in stage of development, types of resources or stressors, and rate of change)
11. Consider technological needs and limitations
12. Accommodate change over time as needs and trends change
13. Be sustainable even with staff turnover
These criteria represent the elements that any PM framework for the LCC Network should strive to embody. They serve as a basis upon which to judge any proposed PM. Not every PM selected will necessarily meet all of these criteria, but the suite of PM selected should reflect these criteria and address most of them. We may need to adapt existing frameworks to address the unique needs of the LCCs.

ASSESSING EXISTING PM FRAMEWORKS
We searched for PM frameworks implemented in natural resource and socioeconomic sectors. We reviewed 48 PM frameworks from natural resource management, education, business, health, international development, and the non-profit arena to identify specific features that lead to effective PM. From those 48 frameworks, we first narrowed our attention to nine with the most relevance to the LCC Network and, after carefully reviewing each, selected six for more thorough analysis. Those six included:
1. U.S.D.A. Forest Service’s Land Management Planning, Monitoring, and Evaluation Framework (2007)
2. World Wildlife Fund’s Standards of Conservation Project and Programme Management
3. Parks Canada’s Ecological Integrity tracking framework (also the model for NPS’s performance measures framework)
4. Annie E. Casey Foundation’s framework for measuring performance of their diverse suite of programs to help disadvantaged youth
5. Black and Groombridge’s framework that adapts a business approach (European Foundation for Quality Management) to fit the needs of conservation
6. The Nature Conservancy’s “Conservation Impact Measures” (https://www.conservationgateway.org/Documents/CBP_Guidance.pdf)

The unique attributes embodied by the LCC Network make applying an existing PM framework impossible. Instead, we analyzed these six frameworks in an effort to garner insights applicable to the network; each harbors unique elements relevant to the LCCs. At our March meeting with the PM Working Group in Washington, D.C., we collaboratively developed a framework based on the component pieces outlined below.
Below we discuss some of the specific elements from each of the six frameworks that were especially relevant to the LCCs. No one framework is perfect – they all have intriguing and promising features that could be combined into a final product for the LCCs.

1. United States Forest Service – Monitoring Framework to Support Land Management Planning
Description: The USFS national Monitoring and Evaluation Framework seeks to establish a system for monitoring the agency’s progress toward desired conditions and objectives agency-wide. Similar to the LCC Network, the USFS manages disparate lands with vastly different ecological systems (forests and grasslands) and works with communities and partners that have varied interests and concerns. This variation has made establishing a framework with common goals and reporting difficult. The agency’s multiple-use mandate makes identifying, let alone establishing and tracking, PM a real challenge. The USFS has approached PM in this way: “Land management planning is an adaptive management process requiring evaluations of social, economic, and ecological conditions and trends that contribute to sustainability and that, therefore, reflect progress towards the land management goals for each NFS unit. Monitoring efforts and evaluations characterize key social, economic, and ecological performance measures relevant to a plan area.” The agency has, in effect, used monitoring as its PM.
Key Attributes: The USFS example provides several important insights that the LCC Network can draw from in crafting their own PM.
(1) Rather than simply draw on PM crafted in the natural resource sector, the USFS incorporated corporate performance measures where applicable to establish their framework.
(2) The USFS framework established six core themes from which to craft PM: 1) conservation of biodiversity; 2) maintenance of land health and vitality; 3) conservation and maintenance of soil, water, and air resources; 4) maintenance and enhancement of social systems; 5) maintenance and enhancement of economic systems; and 6) infrastructure capacity.
(3) The framework directly acknowledges that “broadly accepted performance measures are essential to collaborative assessment, planning, and decision-making processes that address shared concerns.” Without broadly accepted PM, it is unlikely that Coordinators will use them to their fullest potential. Top-down measures, rather than collaboratively developed ones, may lead to a lack of compliance among disparate groups.

2. World Wildlife Fund – Standards of Conservation Project and Programme Management (adapted from the Open Standards for the Practice of Conservation)
The World Wildlife Fund, with conservation projects across the globe, has established standards to help guide management actions. Designed as best practices that project managers can draw from, the standards allow for flexibility in determining which actions to take given local conditions. This Results Based Management approach includes creating a framework to facilitate collaboration among conservation partners in project design, tracking outcomes, and developing a
performance and learning culture. The performance and learning culture has particular relevance to the LCCs. WWF has many conservation projects that involve efforts by collaborators and partner organizations or programs, which makes tracking performance more difficult than tracking outcomes from a single entity. WWF instituted a Conservation Measures Program with monitoring and reporting systems tailored to the specific challenges posed by disparate conservation programs working on projects worldwide. The program established key performance indicators and a related tracking database. The WWF process includes establishing a clear vision rather than individual tasks; incorporating new ideas to refine a project; openness to frank discussion of alternative views and perspectives; rewarding new ideas; sharing ideas across programs; prioritizing performance tracking, and discussion of how to improve it, at all levels; and elevating learning so that everyone recognizes it as a legitimate and important activity. The process specifically entails internal reviews and audits conducted by project or organization team members, as well as external reviews conducted by third parties, to provide multiple evaluations.

Key Attributes: The learning and performance culture embodies elements sought by LCC Coordinators:
(1) It recognizes the need to allow local managers the ability to craft their own strategies and indicators to measure progress;
(2) It recognizes the challenges inherent in working with collaborators and partner organizations and provides a means by which to incorporate their interests and sentiments;
(3) It provides a means to aggregate PM from individual projects up to the organization as a whole;
(4) It utilizes an iterative process for identifying goals and ways to track them, while also promoting learning and flexibility;
(5) It acknowledges the need to take a broad view of goals and performance tracking, focusing attention not on small-scale outputs but on the broader outcomes the organization seeks to achieve.

3. Parks Canada – Ecological Integrity Monitoring and Reporting Program
Parks Canada has made maintaining and restoring ecological integrity across its national parks a priority, so a key performance expectation is improving ecological integrity across parks. To do so, Parks Canada has instituted ecological integrity indicators and established performance expectations to track them. In particular, 26 Canadian parks must improve one ecological integrity indicator by 2014. A monitoring and reporting program designed to use science-based data tracks progress toward success. This system requires all 42 national parks to write a State of the Park Report every five years.
The indicators used to track progress towards restoring ecological integrity take several forms. At a broad level, large ecosystems, such as forests or wetlands, serve as indicators, and science-based monitoring serves as the PM in this system. To track these systems, Parks Canada inventories species and ecological characteristics of importance, creates measures to monitor ecological structure and process, and identifies key stressors. The process has, at its core, established performance expectations for which each unit is responsible. It also establishes a clear mechanism to determine whether monitoring and other performance activities have led to improved ecological integrity. Key Attributes:
(1) Each park has the ability to identify and track indicators that reflect their local situation and interests;
(2) The Poor, Fair, Good tracking system focuses on trends over time, providing tangible information on effectiveness in a manner that is both easy to understand and track;
(3) This system, already in place in Canada, has yielded information that is useful to both managers and the public at a low cost to implement, both in time and money;
(4) A rotating evaluation process gives each park an external review of progress and integrity while reducing the time managers spend on self-evaluation;
(5) This system allows each unit to track relevant indicators while enabling those indicators to scale up to the national level, showcasing progress on performance without relying on narrowly focused indicators.

4. Annie E. Casey Foundation – A Road to Results
The foundation has created a PM handbook to help guide and assess its education investments. Funds from the foundation go to diverse groups (e.g., schools and non-governmental organizations), making tracking outcomes difficult. To address this difficulty, the foundation has established a PM tracking system at the grantee, program, and foundation levels. The system focuses not only on PM but also on performance goals: “desired levels of results on specific performance measures within a set time frame.” Rather than focus only on outputs (e.g., number of customers served), this approach also assesses the associated outcomes (number of customers served well). The foundation has established a PM matrix that links activities to their effectiveness. This approach also has a characteristic not found in the other frameworks discussed above: the foundation explicitly states that not all PM are created equal. It recognizes that some have more importance or relevance than others and strives to acknowledge this hierarchy in assessments. Another unique attribute relates to which PM are used. The foundation has established common measures that all grantees must track, but it has also established a strategy that enables
grantees to create their own optional PM. This gives each entity the ability to establish and track indicators that it deems important. Key Attributes:
(1) The foundation has established both common and unique PM to serve the foundation’s and the grantees’ needs;
(2) Rather than being open-ended, the PM used in this framework have specific timeframes attached to them;
(3) The PM have relative importance levels associated with them, distinguishing those that matter most;
(4) The foundation created a database that enables each partner to track their PM and create unique PM to fit their needs.

5. Business Excellence Adapted to Conservation (Black and Groombridge 2010)
This model for measuring performance was adapted from the European Foundation for Quality Management (EFQM) model, one of the most widely used business excellence frameworks in the world, which has been associated with consistently good financial results, effective operations, and satisfied customers. It is based on a suite of beliefs and behaviors that include visionary leadership, focus on results, management by fact (not philosophy), a systems perspective, valuing employees, societal responsibility, and continuous improvement. The model is represented as a nine-box diagram in which each box represents a unique component of management practice. This nine-box system was adapted to better reflect the language and needs of conservation by Black and Groombridge (2010), resulting in a “Conservation Excellence” framework. The framework reflects the assertion that continuous improvement is the key to excellent conservation outcomes.

Figure 2. The nine-box system from EFQM translated into conservation terminology for Conservation Excellence (from Black and Groombridge 2010). The model is color coded by people (white), process (black), and performance (gray) to show the links between process and outcomes. The size of each box represents its potential relative importance.

The Conservation Excellence framework consists of a series of subcriteria that are scored within each of the nine boxes of the model, resulting in scores for each box as well as an overall score. Different criteria or different boxes can be weighted based on factors unique to each unit (or LCC); this variable weighting reflects variation in priorities, time frames, conditions, values, and other factors. When compared against the Open Standards framework (used by WWF and other conservation organizations), the Conservation Excellence model was superior in its consideration of human resources and budget management (Black and Groombridge 2010) and provided a more comprehensive, whole-system assessment of a conservation program. Key Attributes:
(1) Increases understanding about the links between approaches taken (tactics or actions) and results
(2) Learning and improvement are integrated into the design, management, and evaluation of a program, its organization, and its systems.
(3) Creates a learning environment that allows for flexibility and adaptation to changes in knowledge or trend
(4) Efficient focus on relevant information
(5) Balances consideration of short- and long-term pressures and trends
(6) A scoring system that is transparent, flexible, and responsive to change.

6. The Nature Conservancy – Conservation Impact Measures (CIMs)
The Nature Conservancy has recently moved away from the Conservation Measures Partnership approach and is trying a different one. Among the reasons for this change: a site-based system tracked in proprietary software was proving unwieldy and had not provided the flexibility for either “scaling up” results or aggregating (“rolling up”) monitoring data from fairly independent, site-based efforts. The Conservancy had also found that mandated measures that might be easy to aggregate were inhibiting collaboration, placing the cost of data collection on those least able to reap the benefits of the aggregation. In addition, a red-yellow-green coded measures system was found to create a false sense of objectivity for what were really qualitative self-assessments. TNC’s new approach acknowledges complexity and the influence of outside factors on conservation outcomes. It integrates flexibility, differences among projects, and the need for iteration. All global and regional priorities are required to develop and report on a limited set of CIMs that measure the changes the project or strategy aims to achieve in human and ecological systems in five categories – ecological, people, policy, management and practice, and sustainable finance. The global and regional priorities report CIMs via online dashboards. The intended audiences for the dashboards are senior and regional line management, the Executive Team, the Board of Directors, and selected key Conservancy supporters, with a purpose similar to that of a company’s annual report to its stockholders. The CIMs reported are specific to each global and regional priority: the Conservancy does not strive to aggregate or roll up CIMs across priorities, nor is there an expectation that regional-level results will roll up to global-level results via common impact measures across scales.
Key Attributes:
(1) TNC-CIMs focus on 1-2 of the most important or salient outcomes in five categories (ecological, people, policy, sustainable finance, and management and practice). It is expected that programs will use other input and output measures for their day-to-day budget and work plan management.
(2) Actual metrics are iterative and flexible; units and timeframes may change over time as conservation strategies grow and mature. There is no top-down mandate to measure certain indicators with the intention of aggregating or “rolling up” across multiple strategies, except where a geographic, thematic, or regional program requires such information for management purposes (e.g., all projects with a forest carbon strategy use third-party standards to measure certain ecological aspects of their impact).
(3) The framework explicitly acknowledges the challenges of complexity in large-landscape conservation and of attributing success in partner-based endeavors.

[Figure adapted from: Robert Chipimbi and Simon Hearn. 2009. Outcome Mapping: bringing learning into development programs.]
(4) The framework was first piloted in September 2011 and has undergone several iterations of senior management review with TNC’s most mature projects. It is scheduled for mainstreaming (to all priorities) over the course of 2013-14.

INTERVIEWS WITH PERFORMANCE MEASURES DEVELOPERS AND USERS
Based on our review, we were optimistic about the ability of these frameworks to fill the needs of the LCCs. We found most of the important criteria to be well represented and found solid examples of those criteria being met in other situations and organizations. When we interviewed those who had developed and/or applied these frameworks in practice, however, our optimism waned.

Scaling – We discovered that none of the frameworks had been successful at capturing performance at both the local (fine) scale and the regional (coarse) scale. The Forest Service, for example, put tremendous time and resources into designing and piloting a scalable framework, only to have it “flop,” as one interviewee related. They continue to struggle with the need to have local measures as well as national measures and how to coordinate those in a meaningful and efficient manner. Similarly, TNC abandoned a previous “scalable” approach for a framework that better reflects the diversity inherent in individual projects and regions.

Output vs. outcome – Many organizations had trouble tracking their outcomes and had reverted to primarily tracking outputs. For instance, Annie E. Casey tried to track (1) how much they did, (2) how well they did it, and (3) what difference it made. But they had to remove #3 from their measures because of the difficulty in determining what to hold themselves accountable for (see the previous figure, where the outcomes are outside the sphere of influence). This issue is especially pertinent to the LCC Network.

Goals alignment – The Annie E. Casey users also found that alignment of goals among different efforts was important for tracking performance over time, and that when different projects were not aligned, tracking became more challenging. They worked to bring diverse groups together so they were more aware of their common goals and how they were all trying to contribute to something larger. Parks Canada created a small number of common goals that individual parks could choose to strive towards, allowing for alignment across different subsets of parks.

Dedicated resources – Time after time we heard that too few resources had been dedicated to measures, and that this led to failure. Dedicated and substantial resources for performance tracking are vital to success. In addition, having an outside organization conduct the data collection, analysis, and reporting was often cited as beneficial, providing increased consistency, continuity, and reliability.

Integration of learning culture – Finally, we heard that one important feature of successful measures is integrating them into the culture or process of the organization to such an extent
that they are automatically considered. Thus, measures become part of the project design phase, and analyzing them becomes part of an overall learning culture (an integral part of adaptive management). Many organizations had negative experiences with patchy data collection or unfunded measures efforts, which led to inconsistency in data collection. Instead, groups like the World Wildlife Fund recommend designing measures into every project, region, or unit and having the larger network come together once a year to coordinate and share experiences and learning.

LCC PERFORMANCE MEASURES WORKING GROUP DESIGN “CHARRETTE”
On March 13-14, 2013, we met with the Performance Measures Working Group (composed of LCC Coordinators and Science Coordinators) as well as national LCC staff and other relevant parties from NOAA, BLM, NCTC, USFWS, and OMB. This meeting provided an opportunity to discuss the strengths and weaknesses of the 6 focal frameworks, hear from Tomer Hassan (OMB) about that office’s need for measures for the LCCs, discuss how OMB measures might relate to SIAS measures, and piece together a framework that best meets the LCC criteria.

LCC NETWORK APPROACH TO MEANINGFUL MEASURES
In light of all that we learned from our review of PM frameworks from diverse sectors, and following a conversation with the Office of Management and Budget (OMB) about its needs for funding allocation for the LCC Network, the LCC PM Working Group decided to create a simple suite of measures with the following characteristics:
1. The measures will be focused on meeting the needs of OMB for the entire LCC Network, leaving individual LCC units to develop their own measures for learning and adaptive management purposes;
2. The measures will focus on showing the “value-added” benefit of the LCCs to conservation;
3. The measures will be reported collectively, so as not to create competition among LCC units;
4. Individual LCCs will have the opportunity to craft their own measures to make sure they are meaningful;
5. The measures will link directly to the 5 bullets of the LCC Mission Statement, with each LCC reporting on 2-5 bullets;
6. Quantitative measures are desirable for OMB’s purposes, but telling stories with measures is also important, so qualitative measures with narratives are also being solicited.
Once the group decided how it wanted to approach measures for the LCC Network, we revisited the LCC Mission Statement and worked to craft measures for each LCC unit. We solicited measures first from the LCC Coordinators in the PM Working Group and then from all LCC Coordinators. We received input from the following 11 LCCs: California, Arctic, South Atlantic, Peninsular Florida, Desert, Great Northern, Upper Midwest Great Lakes, Northwest Boreal, Aleutian and Bering Sea Islands, Pacific Islands, and Western Alaska. Coordinators were asked to submit measures that they are willing to report on at the network level (collectively),
for purposes of reporting to Congress and OMB. Many measures were appropriate across more than one of the 5 bullets in the Mission Statement. We then combined measures that were highly supported and similar enough to be combined, but that originated under different bullets or from different LCC Coordinators, for a total of 6 measures. Those combined measures are presented below; the original suite is also provided for additional information and so the Performance Measures Working Group can continue to refine the approach as needed.

Note on OUTPUT vs. OUTCOME – Most of the initial measures listed here are output measures, even though we identified outcome measures as being more meaningful. We encourage LCCs to begin to develop outcome measures that stem from the output measures over time. Because so many LCCs are relatively new, most are not ready for outcome measures at this stage in their development. Outcome measures are suggested for each of the 5 bullets in the notes that follow the recommended measures; these could be used by a subset of LCCs or incorporated into the reporting structure over time. In general, there was a higher level of agreement on output measures, so they are listed as the “Proposed” measures, but many good outcome measures are listed in the notes that follow.

The LCC Mission Statement: A network of cooperatives depends on LCCs to:
• Develop and provide integrated science-based information about the implications of climate change and other stressors for the sustainability of natural and cultural resources;
• Develop shared, landscape-level conservation objectives and inform conservation strategies that are based on a shared scientific understanding about the landscape, including the implications of current and future environmental stressors;
• Facilitate the exchange of applied science in the implementation of conservation strategies and products developed by the Cooperative or their partners;
• Monitor and evaluate the effectiveness of LCC conservation strategies in meeting shared objectives;
• Develop appropriate linkages that connect LCCs to ensure an effective network.
PROPOSED PERFORMANCE MEASURES FOR THE LCCS (DEVELOPED TO ADDRESS THE 5 BULLETS ABOVE)

Performance Measure #1 – Website use and online mechanisms
a) Number of unique hits, visits, and downloads from LCC websites and portals, as well as other sites hosting LCC-related data and documents. (output)
b) Number of mechanisms (like atlases, conservation planning tools, projections, etc.) (output)
Addresses Bullets #1 and #3 (use and exchange of information)

Performance Measure #2 – In-person and electronic communication of information
a) Number of webinars, meetings, workshops, and forums that address common conservation objectives, future scenarios, vulnerability assessments, adaptation strategies, stressors, and adaptive management. (output)
b) Number of webinars, meetings, workshops, and forums that provide science and tools. (output)
c) Number of participants in LCC webinars, meetings, workshops, and forums (output)
Addresses Bullets #1, #2, and #3 (use and exchange of information, shared objectives)

Performance Measure #3 – LCC-funded projects
a) Number of projects funded by LCCs (output)
b) Joint projects across LCCs (output)
c) Number of organizations and entities that the LCCs are working with (output)
Addresses Bullets #2, #3, and #5 (exchange of information, shared objectives, and linkages)

Performance Measure #4 – Outreach
a) Number of presentations (for partners or conferences), webinars, etc. on the LCCs, their products, projects, and strategies (output)
b) Number of attendees at presentations (for partners or conferences), webinars, etc. on the LCCs, their products, projects, and strategies (output)
c) Number of websites, Facebook pages, etc. that mention the LCCs (output)
Addresses Bullet #3 (exchange of information)

Performance Measure #5 – Linkages among LCCs and others
a) Number of joint projects with more than one LCC (also see #3b above) (output)
b) Number of multi-LCC and LCT conference calls, meetings, decisions up for voting, and other coordination efforts (such as coordinated adaptation strategies) (output)
c) Number of projects generated by similar projects from other LCCs (output)
d) Coordination with CSCs, USGS, and others (output? outcome?)
e) Number of integrated adaptive management science teams/efforts (output)
Addresses Bullet #5 (linkage)

Performance Measure #6 – Efficacy (on-the-ground conservation delivery)
a) Case study examples of how LCC activities led to the implementation of conservation/adaptation strategies (outcome)
b) Testimonials from partners/case study examples of how information exchange, facilitated by the LCCs, led to the implementation of adaptive management (outcome)
Addresses Bullets #3 and #4 (exchange and effectiveness)
Appendix 1. NOTES from the collection of information from LCC Coordinators on performance measures, based on the 5 bulleted statements of the LCC Mission Statement. This is a compilation of all input from 11 different LCCs (attributed by LCC acronym below).
Note from Marni – ALL metrics are to be used COLLECTIVELY – not to pit one LCC against another!

Bullet 1. Partners are using LCC-derived integrated science-based information about the implications of climate change and other stressors for the sustainability of natural and cultural resources. (Note: the actual bullet in the mission statement is about developing and providing information; whether or not people use it is out of the control (IS IT?) of the LCC, so there was one comment about rewording this bullet to reflect that LCCs develop and provide the information, rather than that people use it.)
Output
• Number of individuals that use LCC-derived products, based on survey results. Survey COULD be done by a third party and yield interesting results about what’s getting used and what’s not (but do they need official permission to do surveys, and is that a problem? And can this be done quickly enough based on a formal survey? And will people answer? There is a lot of doubt about the viability of this metric) GNLCC, SALCC
• Number of resource planning processes informed by LCCs (this needs more detail to clarify what, specifically, is meant by “informed by,” how many planning processes would be expected, and what numbers would be considered success) DESERT LCC; ABSI
• Number of hits, unique visits, and downloads from LCC websites and portals (Coordinators need technology to do this – they can work with the communications group or use a tool like Google Analytics) DESERT LCC; ARCTIC (this would give us “trends in data use” and they are happy to do it, if it’s required); GNLCC; WALCC; NWB LCC; SALCC; PFLCC
• Partner attendance in LCC-related forums (meetings, webinars, workshops, trainings, etc.) CALCC; GNLCC (but some LCCs cannot host more than a small number of people, such as the WALCC) NWB LCC (this works as a collective measure, but not as an individual one because of differences in population size). ABSI (not just Steering Committee, but other “partners” too – all participants in workshops and webinars)
• Progress on 1-2 priority processes or programs that are expected to be using LCC-derived information, as reported every 6 mos. by Steering Committee (this assumes that units have identified priorities, but not all have done that yet, so it would be a good metric for more “mature” LCC units) CALCC (This will help us incorporate an adaptive mgmt. approach to disseminating, synthesizing, and researching information to our users.); WALCC; NWB LCC
• LCC Map use GNLCC
• Evidence that the LCC priorities and products are being used to align partner activities WALCC
• Number of citations of LCC-sponsored products in published planning and strategic documents for member organizations and beyond.
• Number of citations of LCC-sponsored products in published conservation research (to assess the secondary impacts of LCC info) PICCC
Notes: (ARCTIC) There are privacy issues about asking people to identify themselves when they download products and information from LCC websites. If you just track downloads, then someone downloading the data twice would count the same as two separate people downloading the data. Also, when users repost the data on another site, it would be difficult to track. If there was a media mention of certain studies or data sources, there could be lots of downloads even if there wasn’t actually greater use of the information.
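One partial answer to the double-counting concern above is to count distinct client addresses rather than raw downloads. The sketch below is purely illustrative (not an adopted LCC procedure) and assumes Python, a web server that writes a standard combined-format access log, and a hypothetical "/downloads/" path for LCC products.

```python
# Illustrative sketch: tally downloads and distinct visitors per file from a
# combined-format web-server access log. File path and URL prefix are
# hypothetical placeholders.
import re
from collections import defaultdict

LOG_PATH = "access.log"  # hypothetical log location
# client IP ... "GET /path HTTP/1.1" status
LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+) [^"]*" (\d{3})')

downloads = defaultdict(int)   # path -> total successful requests
visitors = defaultdict(set)    # path -> distinct client IPs (rough proxy for people)

with open(LOG_PATH) as log:
    for raw in log:
        match = LINE.match(raw)
        if not match:
            continue
        ip, path, status = match.groups()
        if status == "200" and path.startswith("/downloads/"):
            downloads[path] += 1
            visitors[path].add(ip)

for path in sorted(downloads):
    print(f"{path}: {downloads[path]} downloads, {len(visitors[path])} distinct visitors")
```

Even this only approximates unique users (shared or dynamic addresses blur the count), so it would complement, rather than replace, the survey-based measures suggested above.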
Outcome
• Case studies with narrative about specific processes, planning, and projections that have used LCC-derived materials, tools, data, or other products (or funding?) DESERT LCC; NWB LCC; ARCTIC; PICCC
• Actions attributable to LCC-related planning efforts (some LCCs don’t do much “planning”) – CALCC (Plans include Forest Service District plans, Refuge Habitat Management Plans, State Wildlife Action Plans, etc. Actions could include grants approved for land acquisition, adaptive resilience habitat management (i.e., restoration – need to work on a new term for an outdated idea), land and water management, protection of species/habitat, etc.)
• Actions attributable to LCC contributions to the resource conservation and management body of information. ARCTIC
• Survey of known management and decision-making efforts that have considered LCC-sponsored products. PICCC
Bullet 2. LCCs are facilitating the development of shared, landscape-level conservation objectives and coordinated adaptation strategies that are based on a shared scientific understanding about the landscape, including the implications of current and future environmental stressors.
Output (number of something doesn’t indicate quality or progress – is this meaningful? WALCC; ARCTIC) ARCTIC comment – these output measures just don’t quite do it, yet no better measures come to mind either!
• Number of meetings and workshops (again, is this meaningful? – more meetings doesn’t mean that something happens on the ground; ALSO, more meetings is the bane of existence for many – not a sign of positive performance) that define common objectives, targets, and strategies. CALCC; GNLCC; UMGL-LCC
• Number of programs that include LCC-developed conservation objectives and strategies as funding or program priorities (would need to ask for this information) CALCC; GNLCC
• Number of websites, Facebook pages, Twitter feeds, etc. that mention LCC units CALCC
• Track LCC-funded projects (is number the appropriate metric here (yes)? Or progress over time? While not necessarily a good metric, it would be easy to track and potentially (?) informative) CALCC; UMGL-LCC; ARCTIC
• Number of shared landscape objectives approved by the Steering Committee GNLCC; SALCC
• Number of coordinated adaptation measures approved by the Steering Committee (none, some, or complete) (note this is not the role of the LCCs, but of member organizations, partners, and agencies) SALCC
• Number of plans, planning documents, webinars, and conference calls that coordinate conservation objectives and/or adaptation strategies UMGL-LCC; NWB LCC; PICCC; PFLCC
• Number of projects involving more than one LCC UMGL-LCC; NWB LCC; PFLCC
• Number of partnerships and processes that coordinate objectives and strategies with consideration of current and future stressors DESERT LCC
• LCC has undertaken a needs assessment that has identified and articulated shared priorities/goals/management objectives among participating entities and identified associated information gaps to fill in obtaining goals and management objectives. DESERT LCC
• LCC has developed applied science and decision support tools useful to resource managers in addressing pressing science and management concerns by leveraging federal and non-federal resources (part of DOI’s priority goal for climate change) allocated to projects or programs that incentivize planning for, and addressing impacts of, climate change. DESERT LCC
• Number of meetings and workshops that identify future scenarios and discuss adaptation strategies NWB LCC
• LCC Charters PFLCC
• LCC Working Groups PFLCC
• LCC Biological planning documents PFLCC
• LCC Conservation design documents PFLCC
• LCC developed conservation targets and surrogate species PFLCC
Outcome
• Testimonials from partners/case studies demonstrating the LCC’s contribution to development of shared objectives and strategies. DESERT LCC; WALCC; PICCC
• Descriptions of how other federal goals, like DOI's priority goals on climate change, are being met through LCC-related work DESERT LCC; WALCC; ARCTIC
• Documentation that the LCC has initiated actions that have resulted in increased collaboration and coordinated partner activities (including and beyond those directly funded by the LCC). WALCC
• Number of planning documents by partner organizations that adopt objectives and strategies developed by the LCC UMGL-LCC
• Qualitative measure of the LCCs’ contribution to informing and developing conservation objectives and adaptation strategies, supported by examples and subject to peer review. ARCTIC
• Description of purpose-‐driven LCC sponsored member working groups and their outcomes. PICCC
Bullet 3. LCCs are facilitating the exchange of applied science that is used to inform the implementation of conservation actions, strategies and products developed by the Cooperative or their partners (exchange between researchers and managers, researchers and researchers, managers and managers, and others)
Output
• Number of trainings on information sharing and tools – CALCC (This could include training on climate science information or climate science tools that have breakout groups who work on real problems.) UMGL-LCC; ABSI
• Number of presentations on LCC products and strategies CALCC; UMGL-LCC; NWB LCC (would want to track number of attendees at the presentations); ABSI; ARCTIC (with the caveat that this could cause information pollution)
• Amount of support provided for travel to meetings (this could be problematic as Congress could actually think that less is better, rather than more) CALCC
• Number of reports downloaded (does this mean # reports or # downloads?) from LCC websites like the California Commons CALCC
• Number of information exchange mechanisms on LCC websites ABSI; ARCTIC (with more information on what an information exchange mechanism is, exactly)
• Number of LCC-sponsored webinars, workshops, meetings, scientific presentations, papers, posters WALCC (but same problem with reporting on the number – it isn’t the number that is important, but the quality); UMGL-LCC; NWB LCC; ABSI; PFLCC
• Number of hits on listserves and websites CALCC; ABSI
• Number of participants in LCC-related webinars CALCC; ABSI
• Hits and unique visitors across all LCC sites that deliver science and data, as well as sites that deliver LCC-related information even if not hosted by the LCCs ABSI; ARCTIC; SALCC
• Number of presentations for partners or conferences (could be an artifact of travel restrictions at local, regional, and federal level) UMGL-LCC; NWB LCC; ABSI; PFLCC
• Development of conservation planning atlas or websites (UMGL-LCC: metrics like this need to be a one-time metric for any LCC, since once an atlas is developed it continues to perform a function indefinitely and does not need to be replicated, nor is it appropriate to take credit for development in every successive performance period; also need to include web content, rather than discrete websites) ABSI; PFLCC
• Development of integrated adaptive science/management teams [that have demonstrated their ability to advance the applied science related to their topic] (some LCCs don’t participate in management, beyond planning) WALCC; ABSI; PFLCC
• Number of organizations/entities that the LCC is partnering with (funded projects, other value-‐added activities) ABSI
• Quantitative measure reflecting how well LCC partners view the LCC’s effectiveness in making available and reflecting the exchange of applied science that is used to inform conservation (e.g., on a scale of 1-10…) ARCTIC
• Documented cases of support of demonstration/test sites where managers can share and learn from own and collective efforts. PICCC
• Categorization of LCC-sponsored efforts into dissemination categories: academic dissemination only; regional management-oriented seminars and workshops; direct tailoring of output for management needs; development of decision tools; etc. PICCC
• Number of ‘translational science’ LCC products that attempt to facilitate management inclusion of science in planning and decision-making. PICCC
Outcome
• Anecdotes about exchange of information – (Arctic) It would be easier, and perhaps more meaningful, for us to describe the different ways in which we are creating an environment that fosters information exchange. For example, we can describe info exchange mechanisms built into our websites; tally our webinars, workshops, meetings, scientific presentations on which we serve, and scientific papers and posters that we present; etc. We can also provide noteworthy anecdotal examples of when we have facilitated the exchange of information; NWB LCC
• Coordination with Climate Science Centers, USGS Ecological Centers, and others (what about SIAS, DOI Priority Goals?) WALCC; NWB LCC; ABSI; ARCTIC; PFLCC
• Case study examples and stories about how information exchange, facilitated by the LCCs, led to the implementation of conservation strategies DESERT LCC; WALCC; NWB LCC; PICCC
• Documentation that new groups are utilizing the information and participating in
activities based on queries to the LCCs, anecdotal information about product use, new involvement in meetings, etc. WALCC
• Qualitative measure reflecting how well an LCC is making available, by whatever means appropriate, the applied science that is used to inform the implementation of conservation strategies and products. ARCTIC
• Case studies of LCC efforts to bring together researchers and managers to tackle specific natural resource issues to broadly assess role of LCC as boundary organization (e.g., sponsoring structured decision making workshop, etc.) PICCC
Note – (Arctic) It would be too time intensive and difficult to interview all partners, hold meetings and phone calls, and take other measures specifically to track the number of times that information exchange was facilitated. This measure should be built into the course of business rather than be collected separately.
Notes – (UMGL-LCC) Testimonials seem anecdotal and more suitable for a communications strategy than for performance measurement.
Bullet 4. LCCs are employing adaptive management concepts to determine to what extent we are successfully addressing the shared conservation objectives of our partners
Output
• Track performance reporting to see how well it is capturing contributions (this needs more clarification – what is being tracked? Is it an output or an outcome?) CALCC
• Follow up Bullet 1 with additional information over time, to include adaptive management CALCC
• Conduct needs assessment using an interview approach (this needs more clarification – what is being tracked? Is it an output or an outcome?) – (Arctic) They conducted an "assessment of future needs for arctic land managers relative to climate change,” which was an extensive series of 30 or so lengthy interviews with many of their key stakeholders. The purpose of these interviews was to make sure we were delivering products that were useful to our partners, and to determine what products they needed that we were not delivering or planning on delivering.
• Document that LCC products are directly addressing decision maker needs through
queries to decision makers and/or utilizing their input in project selection. WALCC
• Number of stakeholder objectives or priority information needs addressed by LCC activities NWB LCC
• Proportion (rather than number) of shared conservation objectives for which some
action was taken by the LCC, AND for which that action was objectively evaluated for effectiveness. ARCTIC
Outcome
• Testing, refinement, and updating of previously approved (prioritized?) conservation objectives and adaptation strategies SALCC
• Track the extent to which the LCC is addressing the shared conservation objective DESERT LCC
• Track the extent LCC information has been used by partners for incorporating climate change knowledge into their decision-making (tying into the DOI’s strategy for climate change). DESERT LCC
• Testimonials from partners/case studies that describe adaptive management efforts and implementation with role of LCC demonstrated DESERT LCC
• Narratives from conservation partners describing how LCC activities have informed
the testing, refinement, updating, and implementation of stakeholder adaptation strategies NWB LCC
• Qualitative assessment of how the LCC is appropriately altering or maintaining its
approach to addressing shared conservation objectives based on: lessons learned, evaluation of past actions, and new information obtained regarding shared conservation objectives. ARCTIC
• Periodically compare past LCC projects’ stated expected goals and impacts with actual delivered products and, most importantly, perceived impacts on regional conservation practice 2-3 years after project completion PICCC
Notes: (GNLCC) Seems like it’s about an operational model, so this is not as critical to measure but perhaps becomes part of the process for evaluation.
Bullet 5. The LCCs are actively linked to ensure an effective network
Output
• Number of multi-LCC (and LCT) conference calls, meetings, decisions up for voting, etc. UMGL-LCC; PFLCC
• Number of joint projects CALCC; GNLCC; ARCTIC; PFLCC
• Number of projects generated by similar projects from other LCCs CALCC; GNLCC; UMGL-LCC
• Count of adjacent LCCs with cross-boundary compatibility of adaptation strategies GNLCC; SALCC
• LCC science projects with more than 1 LCC GNLCC; UMGL-LCC; ARCTIC; PFLCC
• Multi-LCC coordination UMGL-LCC; NWB LCC (but how is #1 different?); ARCTIC; PFLCC
• National LCC working groups ARCTIC; PFLCC
• National LCC Charter (its existence is progress?) ARCTIC; PFLCC
• National performance metrics (this is what we are doing here – need to show progress) PFLCC
• Number of activities the LCC staff is participating in that are network or multi-LCC focused. WALCC
• Proportion of funds allocated towards, and proportion of staff time spent engaged in, multi-LCC and National LCC Network endeavors. ARCTIC
• Collaboration with efforts beyond the LCC network and related footprint. This is particularly important to the viability of LCCs that are on the geographical edge of the network, where ties beyond the LCC footprint may be critical PICCC
Outcome
• Change over time of social network analysis toward desired state of connectedness DESERT LCC
• Survey of individuals (but are people willing to take surveys? Are they too tired of them?) within LCC about connections across LCC boundaries, by third party SALCC
• Descriptions of partnerships among LCCs and cross-‐boundary projects DESERT LCC
• LCC has involvement by multiple sectors and organizations and addresses multiple resource issues, measured by resources leveraged by LCC partners (funding and in-kind contributions) and/or projects that cross LCC boundaries. DESERT LCC
• LCC assists partners with internal agency or organization communication (within an individual agency/organization as opposed to among multiple partners) about climate change and other priority topics identified by the LCC. DESERT LCC
Notes: (GNLCC) I would perhaps categorize this as networking effectiveness among societal, political, management, and science networks. As for outcomes, it may make sense to have ‘short duration’ outcomes such as some of those you’ve defined (with more substance and content quantification) and ‘long duration’ outcomes (or other appropriate language) that get to biological and ecological outcome measures related to ecological integrity and process, and resource measures (habitat, species, or cultural). Developing broad-based ecological and resource outcomes would allow us to connect our work to more on-the-ground landscape outcomes, which I believe is part of what Congress is looking for from LCCs (at least within the context of how FWS views this program). There are clearly some tricky parts of this for the network, but I think it’s doable, particularly as we get closer to some network-wide adoption of what we mean by ‘ecologically connected landscapes’ and other nationally relevant efforts (NFWPCAS). We may need to define these loosely to start and get more specific as our network and LCCs mature. In my intent, ‘short duration’ means something we can measure from year to year, and ‘long duration’ would require more years of data on resources and landscape measures. I would think we want to continue to track short-interval measures even as we define, refine, and adopt ecological or cultural resource outcomes.

SOCIAL NETWORK ANALYSIS
The LCCs aspire to be “knowledge brokers” and catalysts that ensure science, particularly climate science, is used in large-scale, multi-partner conservation planning and action. One promising way to measure this impact is Social Network Analysis (SNA), the mapping and measuring of relationships and flows between people and/or organizations. SNA has already been used successfully to evaluate other federal and state natural resource efforts, such as the Fire Learning Network (see Bruce Goldstein’s work at http://conservationlearningnetworks.weebly.com/fire-learning-networks.html) and the Oregon Department of Natural Resources (contact Ken Vance-Borland: [email protected]). These networks also establish social norms and communities of learning for large-scale conservation endeavors, and SNA can show not only how actors link together through flows of knowledge and resources, but also how these factors relate to ecological and social features of interest (see the figure from Guerrero and colleagues 2013).
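To make the SNA idea concrete, the sketch below shows one way reported working relationships among partners could be summarized into simple connectedness scores. It is an illustration only, not a method from Goldstein, Vance-Borland, or Guerrero and colleagues; the organization names and ties are hypothetical, and it assumes the open-source Python library networkx.

```python
# Illustrative sketch: score connectedness in a small, hypothetical LCC
# partner network using networkx.
import networkx as nx

# Each pair is a reported working relationship between two organizations
# (e.g., from a partner survey); all names here are made up.
ties = [
    ("LCC A", "State wildlife agency"), ("LCC A", "Tribal partner"),
    ("LCC A", "LCC B"), ("LCC B", "University"),
    ("LCC B", "State wildlife agency"), ("University", "NGO"),
]

G = nx.Graph(ties)

# Network-level scores that could be tracked from year to year.
print("Density:", round(nx.density(G), 2))
print("Fully connected:", nx.is_connected(G))

# Degree centrality flags potential "knowledge broker" organizations.
for org, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(f"  {org}: {score:.2f}")
```

Scores like these, recomputed from periodic partner surveys, would be one way to express the "change over time toward a desired state of connectedness" outcome suggested by the Desert LCC above.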
Because of the network distribution of the LCCs, using SNA to assess linkages and communication could be relatively simple and effective. It has the additional benefit of engaging multiple stakeholders in the process of data collection and of serving as an outreach tool to promote the learning and collaboration that networks such as the LCCs strive for. Systems for scoring network linkages have been developed and could be implemented to assess, and help guide where to focus, LCC efforts.

REFERENCES
Black, S. and J. Groombridge. 2010. Use of a business excellence model to improve conservation programs. Conservation Biology 24:1448-1458.
CMP. 2009. Open Standards for the Practice of Conservation. Conservation Measures Partnership, Bethesda, Maryland.
Guerrero, A. M., R. R. J. McAllister, J. Corcoran, and K. A. Wilson. 2013. Scale mismatches, conservation planning, and the value of social-network analyses. Conservation Biology 27(1):35–44.

LCC Charter. 2012.
Manno, B. V., S. Crittenden, M. Arkin, and B. C. Hassel. 2007. A Road to Results: A Performance Measurement Guidebook for the Annie E. Casey Foundation’s Education Program. Annie E. Casey Foundation, Baltimore, Maryland.
Parks Canada. 2009. EI Monitoring and Reporting Program. Presentation at the Environmental Evaluators Network meeting, 21 September, Ottawa, Canada.
USFS. 2007. LMP Monitoring and Evaluation: A Monitoring Framework to Support Land Management Planning. US Department of Agriculture, US Forest Service, Washington, DC.
WWF. 2012. Standards of Conservation Project and Programme Management (PPMS): Version 19 October 2012. White paper available at http://awsassets.panda.org/downloads/0_0_wwf_standards_overview_2012_10_19.pdf