
Article

Evaluation 16(2) 137–152

© The Author(s) 2010
Reprints and permission: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/1356389009360478
http://evi.sagepub.com

Corresponding author: Raynald Pineault, Direction de santé publique de Montréal, 1301 rue Sherbrooke Est, Montréal Qc, H2L 1M3, Canada. Email: [email protected]

Conceptual and Methodological Challenges in Producing Research Syntheses for Decision- and Policy-Making: An Illustrative Case in Primary Healthcare

Raynald Pineault, Paul Lamarche, Marie-Dominique Beaulieu
University of Montreal

Jeannie Haggerty

University of Sherbrooke

Danielle Larouche, Jean-Marc Jalhay, André-Pierre Contandriopoulos, Jean-Louis Denis
University of Montreal

Abstract

This article presents and discusses five challenges encountered in conducting a knowledge synthesis on primary healthcare, commissioned by the Canadian Health Services Research Foundation. These challenges are (1) conceptualizing, defining and operationalizing complex interventions; (2) integrating quantitative and qualitative studies and assessing strength of evidence; (3) incorporating expert opinions and decision-makers' viewpoints; (4) producing timely results; and (5) presenting the results in a concise yet understandable form. We also propose methods and operational tools to deal with these issues, particularly regarding the integration of qualitative and quantitative evidence and the incorporation of expert opinions into syntheses. The major challenge of the synthesis was to provide pertinent and useful information for decision- and policy-makers while maintaining an acceptable level of scientific rigour. This approach seems promising for knowledge syntheses, which sustain a deliberative process leading to more enlightened decision- and policy-making.

Keywords: knowledge translation; primary healthcare; research synthesis


Introduction

The need to produce knowledge syntheses in healthcare has been emphasized in recent years, particularly in the context of knowledge exchange/translation between researchers and decision-makers (Klein, 2000; Sheldon, 2005; Tranfield et al., 2003). Applying the evidence-based medicine (EBM) model to healthcare management and policy-making raises enthusiastic but also cautionary reactions (Black, 2001; Cookson, 2005; Klein, 2000; Victora et al., 2004). The EBM model has traditionally been associated with quantitative approaches to syntheses. While the quantitative approach remains dominant, it is increasingly common to include qualitative research in evidence syntheses to take into account the complexity of organizations and the contexts in which they operate (Dixon-Woods et al., 2001, 2005; Mays et al., 2005; Pawson et al., 2005). While several attempts have been made to produce more diversified forms of syntheses, the integration of both qualitative and quantitative evidence into a single synthesis has been limited (Oliver et al., 2005; Reed et al., 2005). In addition, the participation of decision-makers in synthesis production has been advocated as a necessary means for increasing the potential use of research findings (Lavis et al., 2005; Sheldon, 2005). But this strategy has proven to have limited application (Pineault et al., 2007).

Producing knowledge syntheses that integrate such elements and meet the expectations and needs of decision-makers presents many conceptual and methodological challenges (Reed et al., 2005). This paper addresses some of these challenges, using as an illustrative example a policy synthesis on primary healthcare (PHC) organization commissioned in 2002 by the Canadian Health Services Research Foundation (CHSRF) (Lamarche et al., 2003a). More specifically, we were asked to identify various models for organizing PHC and to determine their relative impact on utilization, responsiveness, quality of care, health outcomes and costs.

This paper presents the main conceptual and methodological challenges encountered in producing the synthesis at different steps of the process, along with the solutions envisaged. These steps were: (1) conceptualizing, defining and operationalizing complex interventions; (2) integrating quantitative and qualitative studies and assessing the strength of evidence; (3) incorporating expert opinions with empirical findings and taking into account decision-makers' viewpoints; (4) producing timely results; and (5) presenting the results in a concise yet understandable form.

Conceptualizing, defining and operationalizing complex interventions

Early on in the synthesis process, we recognized that PHC represented a constellation of attributes, and thus a complex intervention, and that the studies concerned with PHC were diverse in terms of conceptual approaches and methods.

The first task we undertook was to conceptualize and operationalize the definition of PHC organizations. Given the complexity of PHC, we opted for a configurational approach that views an organization not simply as a juxtaposition of individual attributes, but rather as a 'multiple constellation of conceptually distinct characteristics that commonly occur together' (Meyer et al., 1993). According to Meyer et al. (1993), 'configurations may be represented in typologies developed conceptually or captured in taxonomies derived empirically'. Typologies are associated with deductive reasoning, whereas taxonomies are associated with inductive thinking. We chose a mixed strategy that combines the theoretical perspective of a typology with the empirical approach of a taxonomy. The final product was thus a theoretically grounded and empirically derived taxonomy. First, we adopted the theoretical perspective proposed by several authors to evaluate organization and system performance. According to this perspective, PHC can be viewed as an organized system for collective action, in which four elements interact with each other in a dynamic and coherent way to produce services (Champagne et al., 2004; Contandriopoulos et al., 2000; Friedberg, 1993). These elements are vision, resources, structure and practices. We operationalized them by focusing on selected aspects of these concepts (Champagne et al., 2004; Contandriopoulos et al., 2000; Lamarche et al., 2003b).

Vision refers to the responsibility assigned to the PHC organization and to the values that guide action. The responsibility of PHC concerns the extent to which it meets the healthcare needs of individuals and populations, and the scope of services, ranging from medical to social. Values held by PHC organizations also involve whether health is viewed as a privilege or as a right, and the organization's social mission. Resources refer to the quantity and diversity of human, physical and financial resources, as well as information, diagnostic and therapeutic technologies. We gave particular attention to the presence of professionals other than MDs, who can facilitate a multidisciplinary approach to care. Structure comprises administrative governance, rules, incentives, methods of payment and mechanisms for integrating services. Finally, practice involves the processes and mechanisms underlying the production of services.
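To make this operationalization concrete, the sketch below shows one plausible way of coding a case along the four dimensions before analysis. It is purely illustrative: the field names are our assumptions, not the authors' actual 30 variables, and the paper does not publish its coding scheme in this form.

```python
# Hypothetical per-case coding structure along the four dimensions
# (vision, resources, structure, practices); field names are illustrative.
from dataclasses import dataclass

@dataclass
class PHCCase:
    # Vision: responsibility assigned to the organization and guiding values
    population_responsibility: bool   # population needs vs individual demands
    scope_of_services: str            # ranging from 'medical' to 'social'
    # Resources: quantity and diversity
    non_md_professionals: int         # professionals other than MDs
    information_technology: bool
    # Structure: governance, rules, incentives, payment
    governance: str                   # 'public' or 'professional'
    payment_method: str               # e.g. 'capitation', 'fee-for-service'
    # Practices: production of services
    multidisciplinary_teams: bool

# Example instantiation for one (fictitious) case
example = PHCCase(True, "medical to social", 3, True, "public", "capitation", True)
```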

This conceptual framework was particularly useful for establishing the categories of variables for which information was to be sought. From these four dimensions, we derived 30 variables. The theoretical approach was then coupled with an empirical one that consisted of four steps: case selection, case description and synthesis, grouping into PHC models, and identification of the distinctive characteristics of these models. Details of these procedures were presented in our report to CHSRF (Lamarche et al., 2003b). In summary, 28 cases were selected to represent a wide variety of organizational models deemed relevant to the Canadian context. They were chosen to represent innovative forms of organization: 9 were implemented, 5 were at the experimental stage and 14 were at the proposal stage. They also varied in origin: 16 from Canada, 1 from the United States, 8 from European countries, 2 from Australia/New Zealand and 1 from an international organization. We do not claim the cases selected to be exhaustive or fair representations of the various forms of PHC organization currently found in industrialized countries. Rather, we tried to ensure that our selection reflected the emerging models for organizing PHC and the wide range of options currently being debated in our countries. Information on the organizational elements was abstracted from various sources: articles, reports and public documents.

Figure 1. A taxonomy of PHC models: four models in two families, two community models (integrated community; non-integrated community) and two professional models (professional coordination; professional contact).


From the 30 variables related to the 28 cases, we derived a taxonomy of organizational models using a statistical programme that combined factor and cluster analyses, with the former reducing the number of variables and the latter grouping the organizational units (Escofier and Pages, 1998; Lambert et al., 1996; Lebart et al., 2000). We applied the criteria proposed by the authors of the statistical programme and stopped the partitioning process of the classification technique when we reached four categories, or models (Lambert et al., 1996). One of these criteria is the ratio of between-group variance to total variance, which indicates when the optimal number of categories has been reached, beyond which there is no gain from additional partitioning. Another, more conceptual criterion is that further partitioning does not enrich our understanding of the distinctive characteristics of the categories. In general, since the purpose of the classification technique is to reduce the number of categories, it is advantageous to keep the number of categories as low as possible. We thus ended with four models: two community and two professional.
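As a rough illustration of this pipeline, here is a minimal sketch using standard open-source Python tools (scikit-learn and SciPy) rather than the SPAD programme the authors actually used; the data matrix is randomly generated and every parameter choice is an assumption made for the example.

```python
# Sketch of the taxonomy-building step: factor analysis to reduce the 30
# organizational variables, then hierarchical clustering to group the 28 cases.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
cases = rng.normal(size=(28, 30))   # 28 cases x 30 variables (placeholder data)

# Step 1: factor analysis reduces the 30 variables to a few latent factors.
factors = FactorAnalysis(n_components=4, random_state=0).fit_transform(cases)

# Step 2: agglomerative (Ward) clustering groups the cases on the factor scores.
tree = linkage(factors, method="ward")

# Step 3: choose the number of clusters with the between-group/total variance ratio.
def variance_ratio(data, labels):
    grand_mean = data.mean(axis=0)
    between = sum(
        np.sum(labels == k) * np.sum((data[labels == k].mean(axis=0) - grand_mean) ** 2)
        for k in np.unique(labels)
    )
    return between / np.sum((data - grand_mean) ** 2)

for k in range(2, 7):
    labels = fcluster(tree, t=k, criterion="maxclust")
    print(k, round(variance_ratio(factors, labels), 3))
# The authors stopped at four categories: two community and two professional models.
```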

The two community models were further differentiated into integrated and non-integrated, and the professional models into coordination and contact models (Figure 1). In summary, community models assume global responsibility for the health of a population through a wide range of services and are publicly governed. Professional model organizations, which are physician-dominated, view their role as responding to the demands of individual clients through medical services; these organizations are often supported by nurses and are under professional governance. What further differentiates the community and professional models is their degree of integration/coordination with the healthcare system and their ability to function in integrated services networks, the more integrated/coordinated being the integrated community and professional coordination models. The professional contact model is the predominant form of PHC organization in Canada and the United States. It ensures accessibility to care for clients presenting with acute problems but is less well adapted to coping with chronic diseases.

Integrating quantitative and qualitative studies and assessing strength of evidence

The second challenge was to integrate qualitative and quantitative information when linking the models to outcome or effect indicators, and to assess the methodological quality of the studies. Again, details of the methods followed are presented more fully in the report to CHSRF (Lamarche et al., 2003b). The indicators covered the following aspects of care:

• effectiveness (health outcomes);
• accessibility, mainly expressed by time to access, and equity of access, that is, the extent to which access to care is ensured to meet patient needs regardless of the socioeconomic status of the individual;
• continuity and its three dimensions: informational, relational and management (Haggerty et al., 2003);
• quality, in terms of appropriateness of care in accordance with clinical guidelines;
• productivity, in terms of utilization and cost and, more particularly, the substitution effect between primary care and other levels of care;
• responsiveness, that is, the degree to which individuals' expectations and preferences are taken into account.

To link the outcome indicators to the four taxonomy models, we conducted a literature search to identify articles and reports that met the following criteria: they had been published between 1995 and 2002; they compared at least two models of the taxonomy; and they focused on our outcome indicators. From an original list of 177 publications, only 38 were retained by the reviewers, who applied these criteria more strictly after reading each paper. None of the 38 publications was included in the material used to document the 28 cases that served to construct the taxonomy.
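The three inclusion criteria map naturally onto a simple screening predicate. The toy sketch below only illustrates that logic; the Paper type and its fields are hypothetical stand-ins, and the actual screening was of course done by human reviewers reading each paper.

```python
# Toy illustration of the three inclusion criteria; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class Paper:
    year: int
    models_compared: int      # number of taxonomy models the study compares
    reports_outcomes: bool    # reports at least one of the outcome indicators

def eligible(p: Paper) -> bool:
    # Published 1995-2002, compares >= 2 models, focuses on the outcome indicators.
    return 1995 <= p.year <= 2002 and p.models_compared >= 2 and p.reports_outcomes

papers = [Paper(1998, 2, True), Paper(2001, 1, True), Paper(1993, 3, True)]
print([eligible(p) for p in papers])   # -> [True, False, False]
```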

The next step was to summarize the articles, using the categories of the conceptual model to classify the information on organizational models and on outcome indicators. The papers were then mapped onto the taxonomy, using the professional contact model as the reference against which the three other models were contrasted. In accordance with the configurational approach, effects were imputed to the model, not to individual model attributes. The magnitude of the effects and their attribution to a given model were determined by two reviewers independently. In case of disagreement between reviewers, either on attribution to a model or on assessment of outcomes, a panel of three reviewers was to make a final decision. Since disagreement never occurred, this procedure was not required.

Next came the major task of assessing the strength of evidence linking the models to the effect indicators (internal validity/credibility) and the generalizability of the results (external validity/transferability). The main difficulty at this stage was to ascertain the quality of both qualitative and quantitative studies, and to judge the internal and external validity of each by appropriate criteria against a common standard (Dixon-Woods et al., 2005; Mays et al., 2005; Oliver et al., 2005). Table 1 shows the criteria we used. Quantitative research criteria are better known, since they are derived from the EBM approach. For example, with respect to design, the highest score was given to randomized clinical trials, then to multiple time series, and so on, as suggested by widely accepted guidelines. Sample size was important for the statistical power to reveal a difference between models. The size of the study population (e.g. many countries, one region) was important to external validity. The nature of the sample, which enables statistical inference, was another criterion. Finally, a multi-site study had greater external validity than a single-site one.

Qualitative research criteria were more complicated. As mentioned earlier, most, if not all, of the studies were quantitative, so that a qualitative analysis was complementary to a quantitative one and supported it in a subsidiary role, according to what has been described as the enhancement model (Dixon-Woods et al., 2004). Two types of analyses guided the search for qualitative elements. Analysis of the logic of intervention was used to determine the theoretical plausibility of relationships between models and effect indicators (internal validity/credibility). Theoretical inference could then be made about the likelihood that similar results would be found if an intervention based on a similar theory were applied (external validity/transferability). For analysis of the logic of intervention, we looked for elements either in the introduction, generally accompanying the theoretical framework and the hypotheses (deductive approach), or in the discussion (inductive, emerging approach).

Table 1. Assessing internal (credibility) and external validity (transferability)

                        Internal validity/credibility                  External validity/transferability
Quantitative criteria   Design; sample size (statistical power);       Study population; number of sites
                        statistical inference
Qualitative criteria    Logic of intervention (theoretical             Theoretical inference; reproducibility
                        plausibility); implementation analysis         of contextual conditions
                        (synergy or antagonism of context
                        with intervention)


We also used an implementation analysis. This type of analysis identified contextual factors and implementation processes that could interact with the models or influence their effects on the indicators (internal validity/credibility). By identifying such factors, an implementation analysis makes it possible to determine the extent to which results would be reproducible under similar contextual and implementation conditions (external validity/transferability). In addition to these considerations, analysis of the logic of intervention and implementation analysis made it possible to generalize on a basis other than time, that is, either on the basis of theory application or of the reproducibility of contextual conditions, thus rendering the synthesis results less time-sensitive.

As for the assessment of the effects and their attribution to the models, each article was reviewed by two investigators from our team. They assessed both internal validity/credibility and external validity/transferability using the quantitative and qualitative research criteria. The grid the investigators used is shown in Figure 2. The two reviewers then compared their scores and attempted to reach a consensus. For this operation, two teams of two reviewers were formed. If consensus could not be reached within a team, there was the option of having the four reviewers meet to reach consensus (unanimous or majority). This procedure did not prove necessary.

Figure 2. Form used for assessing internal and external validity of empirical studies

Quantitative factors, internal validity:
Level 1: (A) RCT, multiple time series, or pre-post cohort with comparison group; (B) large sample size
Level 2: (A) other quasi-experimental design with comparison group; (B) average sample size, variable statistical power
Level 3: (A) non-experimental approaches; (B) small sample

Quantitative factors, external validity:
Level 1: (A) large study population; (B) random sample or entire population; (C) more than 5 sites
Level 2: (A) average or small study population and/or selection; (B) non-random sample; (C) few sites (fewer than 5, more than 1)
Level 3: (A) study population; (C) single site

Qualitative factors, internal validity:
Level 1: (A) intervention logic: theoretical framework, assumptions or grounded theory; (B) context: contextual factors linked to the intervention and discussed in relation (explanation) to the impacts (implementation analysis)
Level 2: (A) intervention logic: reference to theoretical factors and high plausibility; (B) context: contextual factors raised but without implementation analysis
Level 3: (A) intervention logic: little or no reference to intervention theory or logic; (B) context: little or no reference

Qualitative factors, external validity:
Levels 1 to 3 grade (A) experimental conditions and (B) theoretical inference on the same descending scale

Header fields: Reference; Reader(s)
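Read as a scoring instrument, the form assigns each study a level from 1 (strongest) to 3 (weakest) on each criterion, and the levels can then be averaged into the validity scores reported later (e.g. 2.3). The snippet below is a hypothetical encoding of that idea, not the authors' actual instrument.

```python
# Hypothetical encoding of the Figure 2 grid: score one study on two
# internal-validity criteria (design, sample) and average the levels.
DESIGN_LEVELS = {
    "rct_or_time_series": 1,   # Level 1: RCT, multiple time series, pre-post cohort
    "quasi_experimental": 2,   # Level 2: other quasi-experimental comparison group
    "non_experimental": 3,     # Level 3: non-experimental approaches
}
SAMPLE_LEVELS = {"large": 1, "average": 2, "small": 3}

def internal_validity(design: str, sample: str) -> float:
    """Average the criterion levels into one internal-validity score (1 strongest)."""
    return (DESIGN_LEVELS[design] + SAMPLE_LEVELS[sample]) / 2

print(internal_validity("quasi_experimental", "small"))   # -> 2.5
```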


Ultimately, the aim of this analysis was to produce an overall judgement on each set of outcome indicators for each PHC model. We applied four criteria to derive a final score:

• the number of observations showing a given effect;
• the direction (+ or -) and magnitude of effect;
• the convergence of observations;
• the scores of internal and external validity.

Figure 3 illustrates how these criteria were applied, using the example of continuity. For relational continuity, 15 studies showed that community models achieved more continuity than professional ones, 5 showed less continuity and 9 showed no difference. Average scores were 2.3 for internal validity and 2.8 for external validity, on a scale ranging from 1 (highest) to 4 (lowest); the convergence of empirical observations on relational continuity was considered to be +/=, given the number of studies with +, = and -. The overall judgement was +, on a scale ranging from -- to ++. The same procedure was applied to every outcome indicator, and the empirical data were aggregated to produce a final judgement score.
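To make the aggregation step concrete, here is a minimal sketch of how the four criteria could be combined algorithmically. The decision rules and thresholds are our own hypothetical placeholders; the paper applied these criteria through reviewer judgement, not code.

```python
# Hypothetical combination of the four criteria into an overall judgement
# for one outcome indicator; thresholds are illustrative placeholders.
from statistics import mean

def judge(signs, iv_scores, ev_scores):
    """signs: '+', '=' or '-' per study; validity scores run 1 (strongest) to 4."""
    plus, equal, minus = (signs.count(s) for s in "+=-")
    # Convergence: dominant direction(s) among the observations.
    convergence = "/".join(s for s, n in (("+", plus), ("=", equal), ("-", minus))
                           if n >= max(plus, equal, minus) * 0.6)
    # Strength: high only if average validity is in the stronger half of the scale.
    strength = "high" if mean(iv_scores) <= 2 and mean(ev_scores) <= 2 else "low"
    direction = "+" if plus > minus + equal / 2 else "=" if abs(plus - minus) <= 2 else "-"
    return convergence, strength, direction

# Relational continuity example from the text: 15 '+', 9 '=', 5 '-'.
signs = ["+"] * 15 + ["="] * 9 + ["-"] * 5
print(judge(signs, iv_scores=[2.3] * len(signs), ev_scores=[2.8] * len(signs)))
# -> ('+/=', 'low', '+'), matching the judgement reported in the text
```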

Incorporating expert opinions and decision-makers' viewpoints

The commissioning agency required that we consider expert opinions and decision-makers' viewpoints in our analysis. Two working sessions were held with representatives of funding agencies and partners during the course of the project, as well as two videoconferences with researchers across Canada. In addition, written comments and suggestions were solicited from three senior scientific advisers, two from Canada and one from Australia. These individuals played an important interpretive role at different stages of the synthesis. As Sheldon (2005) puts it, 'It is easy to talk about how important it is to involve decision-makers in the production of reviews, but it is less clear how this could be done in any meaningful way other than by including them in advisory or reference groups'. This issue was resolved by conducting a Delphi survey among decision-makers, academics and clinicians involved in PHC, as a complement to the analysis of empirical data. Details of this survey are available in the report (Lamarche et al., 2003b).

Figure 3. Application of the four criteria to assess continuity. For each aspect of continuity (relational continuity, care management), the form records the studies considered, the number and sign of observations (the sign always refers to the indicator; the example presented compares the community models with the professional models), the internal and external validity scores (from 1, strongest, to 4, weakest), the convergence and strength of the evidence, and the overall judgement.


Participants in the Delphi survey were first given a complete description of the four models. They were then asked to assess the expected impact on the effect indicators of implementing three PHC models (the two community models and the professional coordination model) relative to the professional contact model, which served as the reference. An analysis performed on three groups of respondents (academics, researchers and decision-makers) showed a high degree of agreement on all responses among the three groups. Their scores were thus aggregated to reflect expert opinions, which were then combined with the empirical data obtained from the synthesis to produce a final judgement on the relationship between the four PHC models and the different outcome indicators (Figure 4).

Figure 4. Combining empirical data and Delphi findings to assess evidence: empirical data are first assessed for strength of evidence, and the resulting empirical evidence is then set against expert opinion from the Delphi survey in a convergence analysis.

Empirical and Delphi findings were integrated using an operational framework (Figure 5). When the data converged, expert opinions confirmed or accentuated the results provided by the empirical data; when they diverged, expert opinions might have a mitigating effect, differentiate the models, or assess their potential. Although the empirical data played the determinant role in measuring effects, the final results integrating the two sources were strongly influenced by the Delphi data, which were decisive in certain cases where the empirical evidence was weaker.

Figure 5. Operational framework for integrating empirical data and expert opinion: expert opinion that converges with the empirical results confirms or accentuates them; expert opinion that diverges mitigates them, differentiates the models or assesses their potential.
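A small sketch of the logic of this framework follows. The mapping from rating differences to the five outcomes is our assumption for illustration (the paper gives no formal decision table); it uses the paper's ordinal rating scale and the professional coordination example from Table 2 below.

```python
# Hypothetical decision rule for the Figure 5 framework: how expert opinion
# (EO) from the Delphi modifies the empirical-data rating (ED) for a model.
SCALE = ["--", "-", "=", "+", "++"]

def integrate(ed: str, eo: str) -> str:
    d = SCALE.index(eo) - SCALE.index(ed)
    if d == 0:
        return "confirmation"
    if d > 0:
        # Experts rate the model higher: accentuation, or assessment of
        # potential when the empirical evidence was weak or negative.
        return "assessment of potential" if ed in ("--", "-") else "accentuation"
    # Experts temper a stronger empirical rating. ('Differentiation' arises
    # across models: experts may separate two models the data rated alike.)
    return "mitigation"

# Continuity, professional coordination model (Table 2): ED '-', EO '++'.
print(integrate("-", "++"))   # -> assessment of potential
```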

By way of illustration, Table 2 shows the result of applying the framework to continuity. The most striking result is probably the difference between the two sources in the assessment of the non-integrated community model. Since there was a lack of clear empirical evidence comparing the integrated and non-integrated community models, expert opinion was decisive in differentiating these two models. The other result that stands out concerns the professional coordination model, which was rated more highly by the Delphi than by the empirical data. This probably reflects the great potential experts saw in this model compared with the professional contact model.

Table 2. Integration of empirical data and expert opinions: continuity

                              Professional model              Community model
                              Contact         Coordination    Non-integrated    Integrated
                              ED     EO       ED     EO       ED     EO         ED     EO
Continuity                    =      =        -      ++       ++     =          ++     ++
Relational                    =      =        -      ++       +      +          +      ++
Integrated management         =      =        -      ++       +      =          ++     ++
Validity (I/E)                Reference       (-,+)           (-,+)             (-,+)
Rank (score)                  3 (=)           4 (=/-)         2 (+)             1 (++)
Influence of expert opinions  Confirmation    Potential       Mitigation,       Accentuation
                                                              differentiation

ED = empirical data; EO = expert opinion.

In sum, while partners and representatives of funding agencies played an important interpretive role in the synthesis, the Delphi survey enabled us to complement the empirical data by integrating expert opinions, including decision-makers' viewpoints.

Producing timely results

The convergence analysis done through the use of the operational framework increased the validity (both internal and external) of the results. Recourse to a Delphi survey also increased the timeliness of the findings by projecting them into the future (Figure 6). This strategy of triangulating data from different points of reference on a time horizon constitutes an innovative and effective way of ensuring the timeliness of synthesis findings.

Figure 6. Timeliness of research synthesis: on the time horizon, systematic reviews draw on the past, the research collective on the present and Delphi surveys on the future, with meta-synthesis spanning all three.


Presenting the results

Last but not least, a challenge in producing syntheses for decision-makers is to present the results in a concise yet understandable form. One way of accomplishing this is to give visual form to the data. Figure 7 shows the results of the synthesis after aggregating the qualitative and quantitative data and combining the Delphi and empirical data. On this Kiviat (1991) diagram, the dotted lines represent an optimal situation arbitrarily located one point above the best observed results (two points in the case of access, which was cited as a major concern in Canada). The models are positioned on the figure according to their rank on the ordinal scale (1 to 4). The community models, and particularly the integrated one, perform better than the professional models on most indicators, except accessibility and responsiveness, where the professional models perform better. For decision-makers, such a visual representation provides a quick overview of the results and enables them to anticipate the effects any proposed change of PHC model could have on selected outcomes. Conversely, if they want to achieve certain objectives with respect to selected indicators, visual representations can indicate which choice of models will be more conducive to attaining those objectives.

Figure 7. Outcome indicators for the four models. The Kiviat diagram plots the professional contact, professional coordination, non-integrated community and integrated community models, together with optimal performance (observed and/or expected), on eight axes: effectiveness, continuity, quality, cost reduction, total utilization reduction (substitution), responsiveness, access and equity of access.
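For readers who want to reproduce this kind of display, the sketch below draws a basic Kiviat (radar) diagram with matplotlib. The rank values are hypothetical placeholders, not the synthesis results, and the styling choices are ours.

```python
# Minimal radar-chart sketch in the spirit of Figure 7; values are placeholders.
import numpy as np
import matplotlib.pyplot as plt

indicators = ["Effectiveness", "Continuity", "Quality", "Cost reduction",
              "Utilization reduction", "Responsiveness", "Access", "Equity of access"]
models = {
    "Integrated community": [4, 4, 4, 3, 4, 2, 2, 4],   # hypothetical ranks (4 = best)
    "Professional contact": [1, 1, 1, 2, 1, 4, 4, 1],
}

angles = np.linspace(0, 2 * np.pi, len(indicators), endpoint=False).tolist()
ax = plt.subplot(polar=True)
for name, ranks in models.items():
    values = ranks + ranks[:1]                 # close the polygon
    ax.plot(angles + angles[:1], values, label=name)
ax.set_xticks(angles)
ax.set_xticklabels(indicators, fontsize=8)
ax.legend(loc="lower right", fontsize=8)
plt.show()
```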

Discussion

Producing syntheses for decision- and policy-making in healthcare organizations entails many conceptual and methodological challenges at different steps of the process (Reed et al., 2005). We have illustrated some of the challenges we encountered, along with the solutions envisaged, in conducting a commissioned synthesis on primary healthcare organization.




During the initial stage, the major challenge was to define the complex intervention variable that is primary healthcare organization. We defined intervention in a very broad sense, that is, as any set of actions producing outcomes, be it a practice, an organization, a programme or a system (Champagne et al., 2004; Reed et al., 2005). Traditional forms of EBM syntheses have dealt with relatively simple interventions such as medical and surgical treatments (Pawson et al., 2005). In addition, medical interventions tend to be context-independent, that is, they do not interact markedly with different characteristics of the context in which they are implemented, nor do they vary significantly from one context to another (Lomas et al., 2005). Conversely, interventions such as health programmes, and especially organizations, are more complex and rely for their coherence upon theories that link the components together (Pawson, 2002). Moreover, these interventions are context-linked: they are indissociable from the contexts in which they are implemented and they interact with these contexts to influence their effects (Lomas et al., 2005; Oliver et al., 2005; Pawson, 2002; Pawson et al., 2005).

It thus becomes difficult to isolate the specific effects of the various components of a complex intervention, as the contingency approach proposes (Meyer et al., 1993; Miller et al., 1998; Plsek and Greenhalgh, 2001). Complexity is better captured by the configurational approach (Fiss, 2007; Meyer et al., 1993). Moreover, in the real world, an organization forms an indivisible whole, and a decision-maker cannot act on one variable at a time while holding all others constant. The configurational approach would thus seem appropriate for knowledge syntheses in which the intervention variables are modes of organization.

The second challenge was related to the quantitative/qualitative debate going on in evaluative research (Dixon-Woods et al., 2001, 2005; Mays et al., 2005; Oliver et al., 2005). According to the so-called relativists, quantitative and qualitative studies cannot be mixed in a synthesis (Dixon-Woods et al., 2004; Mays et al., 2005): it must be either qualitative or quantitative, but not both. In other words, each type of research plays an independent and distinctive role in synthesis, based on a 'difference model' (Dixon-Woods et al., 2004). If the approach is quantitative, the synthesis tends to be integrative, whereas the qualitative approach tends to be more interpretative, although this is not an absolute rule (Dixon-Woods et al., 2004). Another view, shared by the pragmatists, or 'realists', contends that qualitative and quantitative studies can indeed be mixed in a synthesis (Mays et al., 2005). They perceive one type of research (generally, but not necessarily, qualitative) as playing a complementary and subsidiary role in syntheses in relation to the other type (generally, but not necessarily, quantitative). In the foreword to the Dixon-Woods et al. report, Kelly justifies this pragmatic viewpoint: 'Certainly there are epistemological and practical differences between research traditions, but this in itself is not a reason for inaction' (Dixon-Woods et al., 2004).

A third challenge was integrating expert opinions and decision-makers' viewpoints. According to the definition proposed by CHSRF, a knowledge synthesis is an evaluation or analysis of research evidence and expert opinion on a specific topic to help decision- and policy-making in the context of knowledge exchange activities between researchers and decision-makers (CHSRF, 2010). The inclusion of experts is thus essential in knowledge synthesis and can be achieved in different ways, ranging from informal advice-seeking to the use of more formal tools such as nominal and focus groups or Delphi surveys (Lomas et al., 2005). Scientific evidence can be viewed either as context-free, following an epidemiological or EBM tradition, or as context-sensitive, following a social sciences and management/organization tradition (Lomas et al., 2005). Expert opinion can add to both types of evidence, depending on the experts invited to participate in the process (researchers, academics, managers, policy-makers). In our synthesis, we observed a high degree of convergence among the three groups of respondents to our Delphi survey.

The distinction made between scientific and colloquial evidence can help identify the role and contribution of decision-makers in the process of producing syntheses (Lomas et al., 2005). As noted earlier, scientific evidence is not restricted to context-free determination of effectiveness but is also concerned with contextual factors, through appropriate methods derived from the social sciences. Colloquial evidence is of a different nature and, in Klein's terms, addresses questions of organizational and political evidence (Klein, 2000, 2003). It is at this point that the opinions and views of managers and policy-makers come into play. The extent to which colloquial evidence complements scientific evidence ensures that decisions and policies that rest on solid scientific bases remain feasible as well as acceptable (Klein, 2000).

The integration of decision-makers' viewpoints can be achieved by different means, ranging from formal mechanisms (nominal groups, focus groups, Delphi surveys) to less formal ones (consultation, reactions to documents, etc.), as was the case for expert opinions. In other words, the contribution of decision-makers to knowledge syntheses can range from a complementary to an interpretive role (Pineault et al., 2007). Regardless of the particular device used, decision-makers' viewpoints are considered through a deliberative/participation process that varies according to how close the final position is to decision-making. This process entails discussions and compromises among researchers and decision-makers (Denis et al., 2004; Lomas et al., 2005; Tranfield et al., 2003). Hence, managing this process becomes a main challenge in producing knowledge syntheses intended to be useful for decision-making.

The fourth challenge was the timeliness of the synthesis results (Innvaer et al., 2002; Lavis et al., 2005). The question here is how to produce timely results in a short period of time. The major limitation of systematic reviews is that they draw upon published articles that are based on past data. The situation, and particularly the context, may have changed over time. Whereas this constraint may have more limited implications for assessing context-free evidence, it becomes crucial in the case of context-sensitive evidence syntheses.

There are different ways of increasing the timeliness of synthesis results. The first is through analysis of the logic of intervention, by which theoretical plausibility can be ascertained, making possible theoretical inference to various situations where the same theoretical approach or model applies (Pawson, 2002; Pawson et al., 2005). A second way of achieving timeliness is through implementation and contextual analyses. Such analyses make it feasible to determine whether the same results are likely to be found in comparable contexts, because contextual factors are by themselves very time-sensitive. Similarly, if there is consistency in the results across different settings and contexts, the findings are more robust to context differences and consequently less time-sensitive. These two approaches are more in line with an interpretive approach to synthesis (Dixon-Woods et al., 2004). In our synthesis, we used both analysis of the logic of intervention and implementation analysis. A third way of achieving timeliness, more in line with an integrative approach, is to triangulate data reported in the past with data extrapolated to the future, in the way the Delphi method is often used. This is what we did. As shown earlier (Figure 6), different forms of syntheses refer to different points on the time horizon axis. Systematic reviews rest upon findings observed in a more or less distant past. Conversely, Delphi surveys are often used to forecast the future. The research collective approach that we have experimented with is a promising formula that synthesizes ongoing or recently completed research and thus yields results that are mainly characterized by their timeliness (Pineault et al., 2006, 2007). Data triangulation across various types of syntheses spanning a broad stretch of time is best achieved by what can be called a meta-synthesis. This is what we attempted to do in the synthesis by triangulating the Delphi with the empirical data.

The fifth challenge consisted of presenting the results in a concise yet understandable form. This is a very important aspect of communication and exchange strategies between researchers and decision-makers. Researchers are often reluctant to simplify and summarize their findings, whereas decision-makers wish to see the main results and conclusions of syntheses (Lavis et al., 2005).


Methodological details are important for researchers, whereas decision-makers base their confidence in the results primarily on the credibility of the researchers rather than on the thoroughness of their methods (Patton, 1999). Trade-offs between summarizing and detailing lead to very uncomfortable situations for researchers when they are constrained, for example, by funding agencies' requirements to a 1-page main message, a 3-page executive summary and a 25-page detailed report; they are then criticized if not enough details are provided (Lavis et al., 2005). Visual representations can be useful tools in these situations, since they can aggregate a large amount of quantitative data. This type of representation can be more difficult for qualitative than for quantitative data. In the synthesis, we used the Kiviat (1991) diagram, which presents a great amount of synthetic information in a simple and easily understandable form.

Conclusion

To be useful for policy- and decision-making, knowledge syntheses must address not only the 'what' questions but also the 'how' and 'why' questions. It is not sufficient to know whether interventions are effective and efficient; it is also important for policy- and decision-makers to understand the conditions under which interventions are implemented to produce optimal results, and what theory, implicit or explicit, underlies their actions.

Whereas the EBM tradition has proven to be most useful in providing tools for answering the ‘what’ questions, it has shown limitations in addressing the ‘how’ and ‘why’ questions. To answer these questions, it is necessary to mix quantitative and qualitative information, find new methods for assessing strength of evidence and combine empirical evidence and expert opinions. All this must be done with the expectation that syntheses will be more useful to decision- and policy-makers while maintaining an acceptable level of scientific rigour. The approach presented in this paper has tried to achieve this compromise and, by doing so, to provide a knowledge synthesis in PHC that can sustain a deliberative process of knowledge exchange between researchers and decision- and policy-makers, and eventually lead to more enlightened decisions and policies.

Acknowledgements

This project was commissioned and funded by the Canadian Health Services Research Foundation, the New Brunswick Department of Welfare, the Saskatchewan Department of Health, the Ministère de la Santé et des Services sociaux du Québec, and Health Canada. The authors wish to acknowledge the participation of Robert Geneau, Ronald Lebeau and Ghislaine Tré in producing the synthesis. They would also like to express their gratitude to Louise Lapierre for her unfailing support throughout the project. Sylvie Gauthier, Audrey Couture and Isabelle Rioux revised the text and provided editorial assistance in preparing the manuscript. Finally, while the contributions of the partners' representatives and scientific advisors are acknowledged, the views expressed in this article remain the sole responsibility of the authors. Jean-Louis Denis holds a chair jointly sponsored by CHSRF and CIHR. Jeannie Haggerty holds a Canada Research Chair on Primary Healthcare Organization.

References

Black, N. (2001) 'Evidence Based Policy: Proceed with Care', British Medical Journal 323(7307): 275–9.
Canadian Health Services Research Foundation (CHSRF) (2010) URL (consulted Jan. 2010): http://www.chsrf.ca/funding_opportunities/commissioned_research/polisyn/descrip_e.php
Champagne, F., A.-P. Contandriopoulos and A. Tanon (2004) 'A Program Evaluation Perspective in Processes, Practices, and Decision-Makers', in L. Lemieux-Charles and F. Champagne (eds) Using Knowledge and Evidence in Health Care, pp. 139–71. Toronto: University of Toronto Press.
Contandriopoulos, A.-P., F. Champagne, J.-L. Denis and M.-C. Avargues (2000) 'L'évaluation dans le domaine de la santé: Concepts et méthodes', Revue d'épidémiologie et de santé publique 48(6): 517–39.
Cookson, R. (2005) 'Evidence-Based Policy Making in Health Care: What it is and What it isn't', Journal of Health Services Research and Policy 10(2): 118–21.
Denis, J.-L., P. Lehoux and F. Champagne (2004) 'A Knowledge Perspective on Fine-Tuning Dissemination and Contextualizing Knowledge', in L. Lemieux-Charles and F. Champagne (eds) Using Knowledge and Evidence in Health Care, pp. 3–40. Toronto: University of Toronto Press.
Dixon-Woods, M., R. Fitzpatrick and K. Roberts (2001) 'Including Qualitative Research in Systematic Reviews: Opportunities and Problems', Journal of Evaluation in Clinical Practice 7(2): 125–33.
Dixon-Woods, M., S. Agarwal, B. Young, D. Jones and A. Sutton (2004) 'Integrative Approaches to Qualitative and Quantitative Evidence', NHS National Institute for Health and Clinical Excellence, URL (consulted Aug. 2008): http://www.nice.org.uk/niceMedia/pdf/Integrative_approaches_evidence.pdf
Dixon-Woods, M., S. Agarwal, D. Jones, B. Young and A. Sutton (2005) 'Synthesising Qualitative and Quantitative Evidence: A Review of Possible Methods', Journal of Health Services Research and Policy 10(1): 45–53.
Escofier, B. and J. Pages (1998) Analyses factorielles simples et multiples, 3rd edn. Paris: Dunod.
Fiss, P. C. (2007) 'A Set Theoretic Approach to Organizational Configurations', Academy of Management Review 32(4): 1180–98.
Friedberg, E. (1993) Le pouvoir et la règle: Dynamiques de l'action organisée. Paris: Seuil.
Haggerty, J., R. J. Reid, G. K. Freeman, B. H. Starfield, C. E. Adair and R. McKendry (2003) 'Continuity of Care: A Multidisciplinary Review', British Medical Journal 327(7425): 1219–21.
Innvaer, S., G. Vist, M. Trommald and A. Oxman (2002) 'Health Policy-Makers' Perceptions of their Use of Evidence: A Systematic Review', Journal of Health Services Research and Policy 7(4): 239–44.
Kiviat, P. (1991) 'Simulation, Technology, and the Decision Process', ACM Transactions on Modeling and Computer Simulation 1(2): 89–98.
Klein, R. (2000) 'From Evidence-Based Medicine to Evidence-Based Policy?', Journal of Health Services Research and Policy 5(2): 65–6.
Klein, R. (2003) 'Evidence and Policy: Interpreting the Delphic Oracle', Journal of the Royal Society of Medicine 96(9): 429–31.
Lamarche, P., M.-D. Beaulieu, R. Pineault, A.-P. Contandriopoulos, J.-L. Denis and J. Haggerty (2003a) Choices for Change: The Path for Restructuring Primary Healthcare Services in Canada. Final Report. Canadian Health Services Research Foundation, URL (consulted Aug. 2008): http://www.chsrf.ca/final_research/commissioned_research/policy_synthesis/pdf/choices_for_change_e.pdf
Lamarche, P., M.-D. Beaulieu, R. Pineault, A.-P. Contandriopoulos, J.-L. Denis and J. Haggerty (2003b) Choices for Change: The Path for Restructuring Primary Healthcare Services in Canada – Appendices. Canadian Health Services Research Foundation, URL (consulted Aug. 2008): http://www.chsrf.ca/final_research/commissioned_research/policy_synthesis/pdf/choices_for_change_appendices_e.pdf
Lambert, T., L. Lebart, A. Morineau and P. Pleuret (1996) Manuel de référence de SPAD. Saint-Mandé, France: CISIA-CERESTA.
Lavis, J., H. Davies, A. Oxman, J.-L. Denis, K. Golden-Biddle and E. Ferlie (2005) 'Towards Systematic Reviews that Inform Health Care Management and Policy-Making', Journal of Health Services Research and Policy 10(suppl. 1): 35–48.
Lebart, L., M. Piron and A. Morineau (2000) Statistique exploratoire multidimensionnelle, 3rd edn. Paris: Dunod.
Lomas, J., T. Culyer, C. McCutcheon, L. McAuley and S. Law (2005) Conceptualizing and Combining Evidence for Health System Guidance: Final Report. Canadian Health Services Research Foundation, URL (consulted Aug. 2008): http://www.chsrf.ca/other_documents/pdf/evidence_e.pdf
Mays, N., C. Pope and J. Popay (2005) 'Systematically Reviewing Qualitative and Quantitative Evidence to Inform Management and Policy-Making in the Health Field', Journal of Health Services Research and Policy 10(suppl. 1): 6–20.
Meyer, A. D., A. S. Tsui and C. R. Hinings (1993) 'Configurational Approaches to Organizational Analysis', Academy of Management Journal 36(6): 1175–95.
Miller, W. L., B. F. Crabtree, R. McDaniel and K. C. Stange (1998) 'Understanding Change in Primary Care Practice Using Complexity Theory', Journal of Family Practice 46(5): 369–76.
Oliver, S., A. Harden, R. Rees, J. Shepherd, G. Brunton, J. Garcia and A. Oakley (2005) 'An Emerging Framework for Including Different Types of Evidence in Systematic Reviews for Public Policy', Evaluation 11(4): 428–46.
Patton, M. Q. (1999) 'Enhancing the Quality and Credibility of Qualitative Analysis', Health Services Research 34(5/2): 1189–1208.
Pawson, R. (2002) 'Evidence-Based Policy: The Promise of "Realist Synthesis"', Evaluation 8(3): 340–58.
Pawson, R., T. Greenhalgh, G. Harvey and K. Walshe (2005) 'Realist Review: A New Method of Systematic Review Designed for Complex Policy Interventions', Journal of Health Services Research and Policy 10(suppl. 1): 21–34.
Pineault, R., P. Tousignant, D. Roberge, P. Lamarche, D. Reinharz, D. Larouche, G. Beaulne and D. Lesage (2006) 'The Research Collective: A Tool for Producing Timely, Context-Linked Research Syntheses', Healthcare Policy 1(4): 58–75.
Pineault, R., P. Tousignant, D. Roberge, P. Lamarche, D. Reinharz, D. Larouche, G. Beaulne and D. Lesage (2007) 'Involving Decision-Makers in Producing Research Collective on Primary Healthcare in Quebec', Healthcare Policy 2(4): 1–17.
Plsek, P. E. and T. Greenhalgh (2001) 'Complexity Science: The Challenge of Complexity in Health Care', British Medical Journal 323(7313): 625–8.
Reed, D. R., E. G. Price, D. M. Windish, S. M. Wright, A. Gozu, E. B. Hsu, M. C. Beach, D. Kern and E. B. Bass (2005) 'Challenges in Systematic Reviews of Educational Intervention Studies', Annals of Internal Medicine 142(12/2): 1080–9.
Sheldon, T. A. (2005) 'Making Evidence Synthesis More Useful for Management and Policy-Making', Journal of Health Services Research and Policy 10(suppl. 1): 1–5.
Tranfield, D., D. Denyer and P. Smart (2003) 'Towards a Methodology for Developing Evidence-Informed Management Knowledge by Means of Systematic Reviews', British Journal of Management 14(3): 207–22.
Victora, C. G., J.-P. Habicht and J. Bryce (2004) 'Evidence-Based Public Health: Moving Beyond Randomized Trials', American Journal of Public Health 94(3): 400–5.

Raynald Pineault, MD, PhD, is an emeritus professor in the Department of Social and Preventive Medicine at the University of Montreal. He is also a medical advisor to the Direction de santé publique de Montréal and the Institut national de santé publique du Québec. Correspondence address: Direction de santé publique de Montréal, 1301 rue Sherbrooke Est, Montréal Qc, H2L 1M3, Canada. [email: [email protected]]

Paul Lamarche is a full professor in the Department of Health Administration at the University of Montreal. Correspondence address: GRIS, Université de Montréal, C.P. 6128, Succ. Centre-ville, Montréal Qc, H3C 3J7, Canada. [email: [email protected]]

Marie-Dominique Beaulieu, full professor in the Department of Family Medicine at the University of Montreal, holds the Doctor Sadok Besrour Chair in Family Medicine. Correspondence address: Clinique de médecine familiale CHUM – Campus Notre-Dame, 2025 rue Plessis, Montréal Qc, H2L 2Y4, Canada. [email: [email protected]]


Jeannie Haggerty, assistant professor in the Department of Community Health at the University of Sherbrooke, holds a Canada Research Chair on the impact of primary healthcare on the population. Correspondence address: Centre de recherche, Hôpital Charles LeMoyne, 3120 boul. Taschereau, Greenfield Park, Qc J4V 2H1, Canada. [email: [email protected]]

Danielle Larouche is working with the Interdisciplinary Health Research Group (GRIS) at the University of Montreal. Correspondence address: GRIS, Université de Montréal, C.P. 6128, Succ. Centre-ville, Montréal Qc, H3C 3J7, Canada. [email: [email protected]]

Jean-Marc Jalhay is a Research Fellow in the Department of Community Health at the University of Montreal. Correspondence address: Institut Emile Vandervelde, 13 boulevard de l’empereur, 1000 Bruxelles, Belgique. [email: jean-marc [email protected]]

André-Pierre Contandriopoulos is a full professor in the Department of Health Administration at the University of Montreal. Correspondence address: Department of Health Administration, Université de Montréal, C.P. 6128, Succ. Centre-ville, Montréal Qc, H3C 3J7, Canada. [email: [email protected]]

Jean-Louis Denis, full professor in the Department of Health Administration at the University of Montreal, holds a CIHR/CHSRF chair on the transformation and governance of healthcare systems. Correspondence address: Department of Health Administration, Université de Montréal, C.P. 6128, Succ. Centre-ville, Montréal Qc, H3C 3J7, Canada. [email: [email protected]]
