
    MANAGEMENT EVALUATION STUDIES

    Factors Affecting the Acceptance of Recommendations

    RAY C. OMAN

    Department of the Army

    STEPHEN R. CHITWOOD

    George Washington University

    Thousands of management evaluations are conducted in the federal government each year. Little information exists about the nature or effectiveness of these evaluations even though they cost more than $200 million annually. This article explores the relationships between kinds of evaluations, analytic methods, and interpersonal processes and the acceptance of recommendations by decision makers. Detailed typologies based on a review of the literature provide the basis for quantitatively testing concepts about the utilization of evaluations against empirical data. Two-hour interviews were conducted with 50 evaluators and decision makers about randomly selected management evaluations. Selected characteristics of the nature, methodology, and process of evaluations were found to be related to acceptance. Some factors are structural and beyond the control of the evaluator, while others are behavioral and within the power of the evaluator to influence.

    Over the past decade there has been widespread criticism of the federal government for poor management. In response to the growing censure, there have been attempts, in the past several years, to reform basic decision processes including planning, budgeting, evaluation, personnel management, procurement, and contracting. Federal agencies have also been subject to numerous personnel ceilings, hiring freezes, and special budget controls in areas including travel expenditure, paperwork, and ADP. Recently, of course, the sizable budget reductions proposed by the administration and approved by Congress have created an austere, cutback environment in most civilian agencies.

    EVALUATION REVIEW, Vol. 8, No. 3, June 1984, 283-305
    © 1984 Sage Publications, Inc.

    Downloaded from erx.sagepub.com at UCLA on September 28, 2015


    Carlotta Young (1978), in a research paper on evaluation utilization developed for the General Accounting Office, identified six factors in the literature associated with utilization. They are as follows:

    - the political decision-making environment
    - organization aspects of the management environment
    - the commitment and involvement of decision makers and evaluators
    - the appropriateness of the research questions
    - methodology, and
    - dissemination and reporting issues

    An examination of the evaluation literature reveals many sources listing factors that affect utilization. Few of the hypothesized factors, however, are examined in any detail. As Patton (1977: 142) argues,

    The issue at this time is not the search for a single formula of utilization success, nor the generation of ever-longer lists of possible factors affecting utilization. The task for the present is to identify and define a few key variables that may make a major difference in a significant number of evaluation cases.

    Following Patton's suggestion, the approach in this research effort is to focus on a few key factors, namely, the kind of study, the nature of the recommendations, and the methodology used in the conduct of the study.

    There are a number of sources in program evaluation literature that pertain directly to the research questions in this proposal. Adams, in developing a prescriptive package for evaluation research in corrections, found that "weaker" research designs were more successful in effecting organization change. Adams concludes that the biggest payoffs come from "weak" or nonrigorous research designs, such as case studies and surveys. Adams (1975: 115) goes on to note:

    Although the reasons for this are not fully understood, some hypotheses may be stated: (a) these non-rigorous styles better fit the decision-making styles and needs of administrators; (b) there is greater pressure on corrections for system improvement than for client improvement, and these studies provide adequate rationales for system change; (c) in times of rapid change, conditions are not favorable for the use of strong research designs; and (d) correctional administrators have not yet supported rigorous designs to the extent required to make them generally effective.


    In addition to the research design aspect of methodology, the degree of decision-maker participation in the evaluation is another area to be explored in the dissertation research. Although participation often is listed as one of many factors affecting the utilization of evaluation findings, the degree or nature of participation is seldom defined or examined. A study by Waller et al. (1979: 11), however, appears to place user involvement in the forefront. The report concludes that,

    the only characteristic of an evaluation system associated with utility was the degree of involvement of the user in an evaluation activity.

    Mark Van de Vall has done research about the effect of methodology and user participation on utilization of applied social science and social policy findings. Van de Vall and Bolas (1979) have concluded that,

    the impact of social policy research upon organizational decisions is higher when the research sponsor and research consumer are identical or closely linked, rather than consisting of two separate and independent organizations; and

    projects of social policy research accompanied by a steering committee consisting of representatives from the research team, the research sponsor, and research consumer(s) tend to score higher on policy impact than projects lacking a steering committee.

    Van de Vall, Bolas, and Kang (1976: 172-173) examined the effect of methodology on the use of research findings in the area of industrial and labor relations in the Netherlands. Some of their conclusions that provide useful background information for this research effort are:

    The use of qualitative methods in applied social research leads to a higher impact upon industrial policy making than using quantitative methods, particularly tabular analysis.

    The more the methodological mixture of applied social research favors qualitative methods, the more intensively the projects are utilized in company policies.

    Factors affecting the acceptance of recommendations may be grouped as follows: (1) small-scale, micro factors that are partially in the realm of the analyst's control, and (2) larger organization, or macro, factors that are outside of the control of the analyst performing the study. The emphasis of this research effort is on the small-scale, micro factors. Macro factors in the larger organization environment influencing the acceptance of study findings include budget cuts or increases, changes in personnel ceilings and hiring freezes, reorganizations, and changes in


    management personnel. Factors in the larger organization environment can affect the acceptability of recommendations. For example, in times of personnel or budget cuts, recommendations dealing with cost-effectiveness, cost reduction, and the streamlining of procedures are likely to be viewed more favorably than they would be in times of expansion.

    Micro factors, the focus of this effort, include dimensions such as methodology, the way decisions were made about how the study would be conducted, the kind of information (data) collected, who collected the information, how the information (data) was analyzed and interpreted, and how recommendations were developed. Micro factors also include the characteristics of the kind of study conducted, such as the purpose of the study, the topical content, the initiator, and the retrospective or prospective emphasis. The analyst may be in a position to control or to alter a number of these factors.

    FRAMEWORK FOR ANALYSIS

    This research explores the relationship among the kinds of management evaluations, the ways that they are conducted, and the acceptance of recommendations by decision makers. To examine this question, three typologies, or classification schemes, have been established relating to (1) the nature of the study, (2) the methodology used, and (3) the interpersonal processes used in the effort.

    The first classification scheme deals with the nature of management evaluation studies. The framework is based on the following factors: (1) purpose of the study, (2) topical content area of the study, (3) initiator of the study, (4) organizational location of the unit studied, (5) organizational location of the MA unit, and (6) the prospective or retrospective nature of the study. The six factors and several subfactors that constitute the typology allow management evaluation studies to be placed in a framework that allows examination of some of the relationships (see Figure 1).

    The conduct of management evaluations may be viewed in terms of "content" and "process." The second and third typologies deal with these areas. For the purposes of this analysis, "content activities" are the analytic methodologies employed in the study. "Process" pertains to the interpersonal processes used and is concerned with who is involved in making decisions about how to conduct the evaluation and who carries


    out the study activities. Typologies have been developed for both methodology, or "content activities," and "interpersonal processes." Information about how these "content" and "process" activities were carried out provides a basis for classifying studies by two sets of factors: scientific-political and solitary-participatory.

    Study methodology usually is based on the steps of the scientific method, that is, problem definition, data collection, data analysis and interpretation, and development of conclusions and recommendations. A wide range of quantitative and qualitative techniques can be used to carry out these generic processes. The classification scheme for study methods is based on these various analytic techniques (see Figure 2).

    Interpersonal processes refers to the ways the methodological, or "content," activities of the study are carried out. "Process" deals with who was involved in making decisions about and in carrying out the various study activities. The typology for process aspects of the evaluation consists of eight factors (see Figure 3).

    METHODOLOGY

    An exploratory, descriptive approach was selected for this research because little theory has been developed about factors affecting study acceptance. The research methodology applied involves a series of structured, multiple case studies. Information for the cases was gathered by interviews, questionnaires, and examination of study reports.

    The twelve cabinet-level civilian departments (with all of their bureaus) and the independent agencies of the General Services Administration (GSA), the National Aeronautics and Space Administration (NASA), and the Veterans Administration (VA) were the organizational entities from which particular MA units were chosen. GSA, NASA, and VA were included from among the independent agencies because they have relatively large numbers of management analysts.

    LIMITATIONS OF THE RESEARCH

    There are a number of limitations of the research that need to be recognized. Three important limitations deal with the study methods and approach. First, although "acceptance" is related to a number of factors, the quality, timeliness, and relevance of recommendations were not examined. Second, "acceptance" was based on the opinions



    The largest study, in terms of analyst's time, meeting these criteria was selected for examination.

    A minimum of three interviews was conducted in each MA unit. The first interview was conducted with the head of the MA unit to get an overview of the unit and to select one study for detailed examination. Second, a structured interview with the analyst who conducted the study was carried out. In most cases, it was necessary to conduct two interviews with each analyst. A copy of the study report was reviewed before the interview with the analyst(s) who conducted the study. Finally, a decision maker responsible for accepting the recommendation was interviewed.

    FACTORS INFLUENCING STUDY ACCEPTANCE BASED ON INTERVIEW DATA AND A TABULAR ANALYSIS

    Information about factors that influenced the acceptance of recommendations is based on facts about each study gathered in interviews, and on a tabular analysis of study characteristics. The more general factors based on interview information are presented first, followed by a discussion of factors based primarily on the tabular analyses. Many factors can influence the acceptance of study findings. Sometimes these factors can operate individually, but often one factor operates in conjunction with other factors.

    Larger studies that took a long period of time had a lower level of recommendation acceptance. A few of the studies took two or more years to complete. These studies had a low level of acceptance. Two of these studies also were begun in one political administration and completed in another. Interviewees noted that studies bridging a change of administration often find acceptance problematic.

    Studies that were done in isolation from the unit studied had a low level of acceptance. One study was conducted in isolation because it was felt that disclosure of the study would result in its early demise. When the study was presented, it had a low level of acceptance. Other studies were not done in complete isolation, but there was not much interaction between the analyst and the decision makers in the unit studied. These studies had lower-than-average levels of acceptance. This finding is corroborated in the program evaluation utilization literature. John Waller et al. (1979) found that the involvement of the user was the primary factor associated with utilization.


    FACTORS AFFECTING ACCEPTANCE BASED ON TABULAR ANALYSIS

    The typologies developed about kind of study, study methods, and study process were examined to establish factors related to acceptance (see Figures 1, 2, and 3). These typologies contained both structural and behavioral factors. Structural factors include, for example, the organization location of the MA unit and of the unit studied. These factors are generally outside of the control of the analyst conducting the study. Behavioral factors refer to those things over which the management analyst has more control, such as who is involved in the study process and the nature of involvement. Other factors, such as methods of information collection and data analysis, are partly within the control of the management analyst. A number of structural and behavioral factors were found to be associated with acceptance.

    The kind of study influences the level of acceptance (see Figure 1). For example, methods and procedures studies have a higher level of acceptance than organization placement and policy studies. Studies initiated by top agency or bureau management have a higher acceptance than studies initiated by top administrative management. Retrospective evaluations have a lower level of acceptance than prospective studies.

    Some study methods are associated with higher or lower levels of acceptance (see Figure 2). For example, management evaluations that did not structure or model but simply described management problems had a higher level of acceptance than studies that attempted to structure the problem and develop quantitative relationships among variables but did not present descriptive information. Similarly, Van de Vall et al. (1976: 173-174) have observed that "the use of qualitative methods in applied social research leads to a higher impact upon industrial policy making than using quantitative methods." Studies that combined quantitative methods with descriptive approaches appeared to have average or above-average acceptance. Our sample of studies using advanced quantitative methods was very small.

    There was some variation in acceptance based on the methods of information collection used in the study. For example, studies that set up experiments and attempted to gather statistical data empirically had lower-than-average levels of acceptance, especially if qualitative methods were not also used. Studies that used structured and unstructured interviews appeared to have a higher-than-average level of acceptance.

    Study process, referring to who was involved in making decisions about how to conduct the study as well as the actual carrying out of the study, also appears to be associated with levels of acceptance of


    recommendations (see Figure 3). For example, studies in which the management analyst together with personnel in the unit studied defined the problem, and studies in which the problem was defined by an interunit study team, had higher-than-average levels of acceptance. Conversely, studies defined by the person requesting the study individually and by the management analyst individually had lower-than-average levels of acceptance. The data suggest that studies defined by an interactive process between the analyst and unit chief or personnel in the unit studied have higher-than-average levels of acceptance.

    Studies in which information collection was done by interunit study teams or by the analyst and people in the unit studied had above-average levels of acceptance. Those studies where teams of analysts handled information collection had lower levels of recommendation acceptance. In general, the data suggest that for studies in which the analyst and personnel in the units studied participated in problem definition, data collection, analysis, and the development of conclusions and recommendations, acceptance was well above average. The level was much lower when these processes were handled by teams of analysts rather than individual analysts. Further, the data suggest that the involvement of people from the unit studied in interunit study teams produces a higher-than-average level of acceptance. Among others, Leviton and Hughes (1981) and Waller et al. (1979) have found positive relationships between the level of user involvement and utility of program evaluations.
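    The tabular analysis underlying these findings is, in essence, a cross-tabulation of study characteristics against levels of recommendation acceptance. A minimal sketch of that kind of tally follows; the study records here are invented for illustration and are not the authors' data:

    ```python
    from collections import Counter

    # Hypothetical study records: (who defined the problem, acceptance level).
    # Categories loosely follow the article's process typology; the counts are invented.
    studies = [
        ("analyst with unit personnel", "high"),
        ("analyst with unit personnel", "high"),
        ("interunit team", "high"),
        ("interunit team", "average"),
        ("analyst alone", "low"),
        ("analyst alone", "average"),
        ("requester alone", "low"),
    ]

    def crosstab(records):
        """Count acceptance levels for each problem-definition process."""
        table = {}
        for process, acceptance in records:
            table.setdefault(process, Counter())[acceptance] += 1
        return table

    def share_high(table, process):
        """Fraction of a process's studies rated 'high' in acceptance."""
        counts = table[process]
        return counts["high"] / sum(counts.values())

    table = crosstab(studies)
    for process, counts in table.items():
        print(process, dict(counts))
    ```

    With data of this shape, comparing `share_high` across processes reproduces the kind of comparison reported above (e.g., participatory problem definition versus the analyst or requester working alone).
    
    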

    REFERENCES

    ADAMS, S. (1975) Evaluative Research in Corrections: A Practical Guide. U.S. Department of Justice. Washington, DC: Government Printing Office.

    COX, G. B. (1977) "Managerial style: implications for the utilization of program evaluation information." Evaluation Q. 1 (August): 499-509.

    LEVITON, L. C. and E. F. X. HUGHES (1981) "A review and synthesis of research on utilization of evaluations." Evaluation Rev. 5: 525-545.

    PATTON, M. Q., P. S. GRIMES, K. M. GUTHRIE, N. J. BRENNAN, B. D. FRENCH, and D. A. BLYTHE (1977) "In search of impact: an analysis of the utilization of federal health evaluation research," in Carol H. Weiss (ed.) Using Social Research in Public Policy Making. Lexington, MA: D. C. Heath.

    U.S. Office of Personnel Management [OPM] (1978) Federal Civilian Workforce Statistics: Occupations of Federal White Collar Workers. Pamphlet 56-14, October 31. Washington, DC: Government Printing Office.

    ——— (1972a) Position-Classification Standards (TS 9), February. Washington, DC: Government Printing Office.

    ——— (1972b) Qualifications Standards (TS 141), February. Washington, DC: Government Printing Office.

    U.S. Office of Management and Budget [OMB] (1978a) Bulletin No. 78-12, April. Washington, DC: Government Printing Office.

    ——— (1978b) Resources for Management Analysis. Washington, DC: Government Printing Office.

    ——— (1977) Resources for Program Evaluation. Washington, DC: Government Printing Office.

    VAN de VALL, M. D. and C. BOLAS (1979) "The utilization of social policy research: an empirical analysis of its structure and functions." 74th Annual Meeting of the Sociological Society, Boston, Massachusetts, August 27-31.

    ——— and T. S. KANG (1976) "Applied social research in industrial organizations: an evaluation of functions, theory, and methods." J. of Applied Behavioral Sci. 12, 2 (April/May/June): 172-173.

    WALLER, J. D. (1979) Developing Useful Evaluation Capability: Lessons from the Model Evaluation Program. U.S. Department of Justice. Washington, DC: Government Printing Office.

    YOUNG, C. J. (1978) "Evaluation utilization." Presented at the Evaluation Research Society Second Annual Meeting, Washington, DC, November 2-4.

    Ray C. Oman, Ph.D., a supervisory management analyst, is a Branch Chief in the Plans and Policy Division of the Management Systems Support Agency, Department of the Army. He has graduate degrees from Pennsylvania State University and George Washington University, specializing in program analysis and evaluation, management, and finance and economics. One of his current research interests is the relationship between the methodology and processes used in management studies and the acceptance of findings by decision makers.

    Stephen R. Chitwood is a Professor of Public Administration at George Washington University. He holds a Ph.D. in public administration from the University of Southern California and a J.D. from George Washington University. One of his current research interests is the relationship between the methodology and processes used in management studies and the acceptance of findings by decision makers.

