
American Council on Education
Center for Policy Analysis

Measuring Quality:
Choosing Among Surveys and Other Assessments of College Quality

Victor M.H. Borden
with Jody L. Zak Owens

Association for Institutional Research
for Management Research, Policy Analysis, and Planning

Victor M. H. Borden is associate vice chancellor for information management and institutional research, and associate professor at Indiana University Purdue University Indianapolis.

Jody L. Zak Owens is a research assistant and graduate student in higher education and student affairs at Indiana University.


Copyright © 2001

American Council on Education
One Dupont Circle NW
Washington, DC 20036-1193

Association for Institutional Research
114 Stone Building
Florida State University
Tallahassee, FL 32306-4462

All rights reserved. No part of this book may be reproduced or transmitted in any form or by any means electronic or mechanical, including photocopying, recording, or by any information storage and retrieval system, without permission in writing from the publisher.

Single copies of this publication may be purchased from the American Council on Education for $15.00 each plus shipping and handling. Multiple copies are available for $10.00 each. Orders may be paid by VISA®, check, or money order (made payable to the American Council on Education) and sent to:

ACE Fulfillment Service
Department 191
Washington, DC 20055-0191
(301) 604-9073
Secure fax line: (301) 604-0158
PMDS Item Number: 309113

Association for Institutional Research
for Management Research, Policy Analysis, and Planning


Table of Contents

Introduction

General Issues

National Assessments of Institutional Quality

Using Assessment Results Effectively

Conclusion

Table 1. Instrument, Administrator, Purpose, Use of Data, History, and Information Collected

Table 2. Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

Table 3. Reporting, Data Availability, Local Items, Costs, and Contact Information


Introduction

Dear College President:

As the public and political leaders have come to perceive higher education as both more important and more expensive than ever, demand has grown for accountability data and consumer information on the relative quality of individual colleges. The Survey of College and University Quality (SCUQ) was developed by leaders in the field of higher education assessment to help postsecondary institutions meet the demands of governing boards, accrediting agencies, and other stakeholders. Participating institutions have found this to be a rich source of data for marketing and recruitment as well. Perhaps most importantly, college and university faculty have embraced the results of the SCUQ as the most credible and useful evidence of student learning in college.

We invite you to join the hundreds of leading colleges and universities that participate in this survey . . .

Does this fictional solicitation sound familiar? In recent years, a proliferation of national assessments of institutional quality has emerged in response to increasing demands for accountability and consumer information. Many of these assessments rely on survey responses from current and former students. They operate on a cooperative system in which campuses pay to participate in the survey and, in return, receive one or more reports of the results and, sometimes, the raw data for further local analysis. Other assessments use data already collected from students when they take college entrance examinations. Standardized tests of college-level critical thinking and subject area achievement also are available, allowing faculty and administrators to compare the performance of students progressing through their programs with that of students at other colleges and universities. College presidents and provosts often decide to participate in these efforts, but they may do so with little information on how best to evaluate these assessments and how to determine which assessment will provide the most useful information for their campus.

The purpose of this guide is to articulate a set of questions and issues that campus leaders can review when deciding whether to participate in a given survey or use a specific assessment instrument. The guide also describes some of the major national surveys and assessments. Although the guide does not rate or recommend these services, it suggests the criteria that campus leaders should employ to determine the use and usefulness of any such instrument or service, based on specific campus needs, capabilities, and goals.

This guide is divided into three major sections. The first section poses some general questions that are important to consider before deciding whether to participate (or continue to participate) in a national assessment. The second section provides common descriptive information for some of the national assessments that were popular when the guide was written. The third section reviews more specific questions and issues regarding the choice of a specific instrument or service and how to optimize participation.

The appendix provides a tabular comparison of the major instruments and services reviewed in the guide. New products and services likely will become available and existing ones transformed or even discontinued after publication of this guide. The Association for Institutional Research will maintain an updated version of the appendix tables on its web site at http://www.airweb.org.

The next section of this guide poses some general questions to consider before engaging in any of these assessment efforts.


General Issues

Do these assessments live up to their promises?

As with all assessment efforts, the value of a national assessment survey or service depends on whether faculty, administrators, and staff members can use the results to support their ongoing processes and activities. Even if they see the questions and results as interesting and informative, they may not find the information useful. This guide’s descriptive information regarding specific instruments and services may help an institution determine whether a particular instrument is relevant to its needs. However, the guide cannot answer questions regarding an institution’s capability for most effectively using any particular type of assessment information.

The assessments described in this guide can be considered information tools for both accountability and improvement. Their usefulness depends on three general criteria:

• The appropriateness of the tool for the specific job at hand.

• The skills and experiences of users.

• The availability of sufficient financial, personal, and material resources.

How do we determine which survey is best suited to our purposes?

If you can ask this question effectively, you are halfway to choosing an appropriate instrument. The key to determining which instruments and assessments will work best for your institution is articulating a shared purpose among those most likely to use the results. For example, finding out about entering students’ expectations and attitudes can help only if that information can be used by academic and student support service managers to develop, refine, and evaluate support programs; by marketers and recruiters to improve strategies for attracting students; or by external affairs staff to develop print and electronic publications.

Who at my institution needs to be involved?

In addition to involving the faculty, staff, and administrators most likely to use the results, you should consider involving faculty and staff with expertise in institutional research, higher education assessment, public opinion polling, or related areas. If you have an institutional research or assessment office, it is especially important that you confer with those staff prior to committing to a national assessment. The proliferation of surveys and assessment instruments at all levels—in the classroom, by departments and programs, by campus offices, and so forth—has resulted in exceptionally high demand for student time and attention. This demand is beginning to compromise students’ responsiveness as well as the quality of those responses. Staff in a centralized institutional research or assessment office often can best manage the overall load of assessment activity and help determine the technical merits of a specific instrument or service.

What is generally involved in participating?

Participation requirements vary among the national assessments considered in this guide, but there is always some significant institutional commitment beyond the cost of participation. For survey instruments, campus staff must generate at least a sample of students to be queried or tested. In many cases, campus staff administer the instrument, either in a group setting (for example, at freshman orientation) or through classroom, electronic, or mail distribution. Typically, the supplier processes completed instruments and prepares a report for the institution. The supplier also may provide customized reports for an additional cost. Often the most useful information comes from the subsequent analyses that campus faculty and staff perform for specific decision applications. The direct cost of participation is less than half of the total resource commitment required. However, the cost of participating in one of these assessments is likely to be small compared to the cost of improvements in programs, services, or accountability that may be facilitated by assessment results. As you consider participation, it is advisable to think about the resources that you are willing to commit to follow-up activities based on the results of the assessment.

The next section introduces some of the most popular national assessment surveys and services currently available for institutional use. It also provides an overview of the detailed information presented in the appendix tables.


National Assessments of Institutional Quality

The three tables at the end of this guide summarize the characteristics of 27 national assessment instruments and services. The first 21 instruments and services assess the attitudes, experiences, and learning goals and gains of entering students (6), various groups of enrolled undergraduates (8), student proficiencies and learning outcomes (5), and alumni (2). Two services offer a series of instruments for students at varying points in their academic careers. The final four instruments and services assess institutional and program effectiveness through the views of various constituents, including faculty, administrators, students, and board members.

Profiles of entering students

UCLA’s Higher Education Research Institute (HERI) has been conducting the Cooperative Institutional Research Program (CIRP) Freshman Survey for more than 30 years. The resulting annual national report receives considerable press attention as the primary indicator of trends in new college student attitudes, expectations, and experiences. By some measures, this is the most widely used freshman survey, with more than 1,700 institutional participants since 1966. Any type of college or university can use the CIRP Freshman Survey, but HERI also offers the Entering Student Survey (ESS), tailored to the needs of two-year public and private colleges.

Many college and university presidents do not realize that most students complete one of two entering student surveys before they begin college. When students take the SAT or ACT college entrance exams, they complete an extensive information form that includes questions about their expectations, attitudes, and past academic behaviors. Moreover, the results of these surveys are available to colleges and universities at little or no cost. Schools that use ACT COMPASS placement test services also can inexpensively incorporate the student profile questionnaire into their data collection processes.

Unlike the CIRP survey, the ACT and SAT profile questionnaires are administered at varying points in time, according to when students take their entrance exams (which can range from sophomore to senior year in high school). The data from these profiles can be useful, but they do not replace a true survey of entering students that is administered immediately prior to, or at the beginning of, a student’s college career.

The College Board offers two versions of an entering student survey—the Admitted Student Questionnaire (ASQ) and the ASQ Plus—that focus on students’ experiences with the college admissions process. The ASQ provides feedback on the marketing and recruitment functions and processes for college admissions and so most directly serves enrollment management operations.

It is often useful to track changes in students’ attitudes, expectations, and experiences as they progress through college. HERI’s College Student Survey, described in the next section, provides this possibility as a follow-up to the CIRP freshman survey. The last survey considered in this section, the College Student Expectations Questionnaire (CSXQ), allows for the same type of tracking. The CSXQ is a prequel to the longer-standing College Student Experiences Questionnaire (CSEQ) reviewed in the next section. The CSXQ gathers baseline information from students regarding their expectations for their forthcoming educational experiences, as assessed subsequently in the CSEQ.

Experiences of enrolled undergraduates

HERI’s College Student Survey (CSS) and the CSEQ (administered by the Indiana University Center for Postsecondary Research and Planning [CPRP]) are two relatively long-standing assessments of the undergraduate student’s college experiences. As a follow-up to the freshman survey, the CSS focuses on students’ level of satisfaction with various aspects of their college experiences. Although the CSEQ also includes a satisfaction index, this survey focuses more on students’ views of their learning experiences. The CSEQ is guided by the principle that students learn best when actively engaged in college activities and experiences.

Both the CSS and CSEQ focus on the four-year baccalaureate experience. However, a Community College Student Experiences Questionnaire (CCSEQ) also is available through the University of Memphis Center for the Study of Higher Education. The CCSEQ follows CSEQ principles but targets the nontraditional, commuter students who typically attend community colleges. The American Association of Community Colleges (AACC) and American College Testing (ACT) have teamed up to produce Faces of the Future, a survey that captures the background characteristics and academic interests of community college students taking either credit-bearing or non-credit classes.

Recently, faculty at Indiana University’s CPRP and UCLA’s HERI have contributed to the development of new undergraduate student assessments that are closely tied to national efforts at transforming undergraduate education. The IU Center now administers the National Survey of Student Engagement (NSSE), which was developed by a panel of leading assessment scholars as a model for quality in undergraduate education. The NSSE uses principles similar to those that guided the development of the CSEQ; however, the CSEQ is a longer instrument that covers specific outcomes and experiences in more detail. The NSSE, although relatively new, has a larger base of institutional participants than the CSEQ.

HERI’s newest assessment instrument was developed in collaboration with the Policy Center on the First Year of College at Brevard College. Your First College Year (YFCY) is both a follow-up to the CIRP freshman survey and an assessment of students’ experiences with first-year programs such as learning communities, residential interest groups, and introductory courses. Similar to the NSSE, the YFCY focuses on specific types of programs and student behaviors that have emerged from higher education literature as best practices in undergraduate learning.

During the past 10 years, many institutions have adopted the Noel-Levitz Student Satisfaction Inventory (SSI) as part of a strategic enrollment management initiative. The Noel-Levitz instrument uses a gap analysis technique to array students’ satisfaction against their perceived importance of various aspects of the college experience. Noel-Levitz also has recently released a version of the SSI tailored to the needs of adult learners, called the Adult Student Priorities Survey (ASPS).

Student proficiencies and learning outcomes

The enrolled student surveys reviewed to this point focus on attitudinal and behavioral aspects of the student experience. Faculty at a number of colleges and universities now use one of several assessment instruments focusing on student learning outcomes in specific content areas and general education. ACT’s Collegiate Assessment of Academic Proficiency (CAAP) consists of a set of modules that assess student proficiency in writing, reading, math, science reasoning, and critical thinking. Users may customize these modules based on institutional needs. The Academic Profile developed by the Educational Testing Service (ETS) provides a similar assessment of general education skills and proficiencies. ETS also offers a specific assessment for critical thinking as well as a set of major field tests in 14 academic subject areas, such as biology, economics, music, and psychology. The Project for Area Concentration Achievement Testing (PACAT), housed at Austin Peay State University, offers flexible content in eight subject-specific Area Concentration Achievement Tests (ACAT), which it can customize for individual institutions. PACAT also offers three additional subject area tests that do not yet have the flexible content option.

Institutions typically use the general education and subject area instruments from ACT, ETS, and PACAT for internal assessment and improvement purposes. However, many institutions tap into the rich source of learning outcomes information from these instruments to demonstrate their effectiveness to accrediting agencies. College administrators, faculty, and staff should familiarize themselves with the different options for assessing institutional effectiveness and the college student experience. Using too many assessment instruments, especially in uncoordinated ways, can undermine efforts to assess institutional quality by compromising the quality of student participation in these efforts.

Alumni status and achievement

Graduates can provide valuable information about how their experiences in college served them in pursuing their postgraduate goals and objectives. The Comprehensive Alumni Assessment Survey (CAAS), produced by the National Center for Higher Education Management Systems (NCHEMS), is available for this purpose. Alumni surveys also are offered as a part of several comprehensive survey programs, which are described in the next section.

More recently, Peterson’s, the publisher of college guidebooks and a provider of web-based college search services, has incorporated into its services the College Results Survey (previously known as the Collegiate Results Instrument). This instrument uses real-life scenarios to assess alumni perceptions of their preparedness for their jobs, community participation, and civic responsibilities. Several institutions participated in a comprehensive pilot of this instrument. Anyone who accesses Peterson’s web site now can complete this survey. Visitors self-select their alma mater and then complete a four-section survey. Peterson’s has not yet analyzed or reported on these unverified responses, and will determine how best to deploy this assessment in consultation with institutions.

Tracking changes in student attitudes and behaviors

Many of the organizations already mentioned offer multiple instruments for different populations, such as entering and continuing student surveys. However, several assessment programs provide a series of instruments that use common questions to help assess changes in student attitudes and behaviors over time. These survey programs are often appealing to colleges and universities that seek to implement a comprehensive range of survey assessments.

The Student Outcomes Information System (SOIS), offered by NCHEMS, includes surveys of entering, continuing, and noncontinuing students, and recent and previous graduates. ACT offers a set of 15 standardized instruments for colleges and universities, under the name Evaluation and Survey Services (ESS), which includes surveys for entering, continuing, noncontinuing, and graduated students. The ACT program includes specific instruments for assessing functions such as academic advising and for assessing the needs of both traditional and adult learners.

Faculty and other constituent views of institutional programs

The last section of each table in the appendix lists several instruments that assess institutional quality by soliciting the views of faculty and other constituents about the campus climate for living, working, and learning. HERI’s Faculty Survey explores the attitudes and opinions of college and university faculty about their work environment. The Noel-Levitz Institutional Priorities Survey (IPS) enables an institution to explore similarities and differences among the priorities of various constituents, including students, faculty, and staff. NCHEMS offers an instrument, in both two- and four-year institution versions, that asks faculty, students, and staff to answer similar sets of questions about institutional performance and effectiveness. Finally, ETS offers the Program Self-Assessment Service, in both undergraduate and graduate versions, which assists academic programs engaging in a self-guided review.

The set of instruments and services reviewed in this section is by no means an exhaustive representation of all the assessments currently available. These instruments and services offer a sampling of the various tools that college and university faculty and administrators can use in their assessment and accountability efforts. Too often, institutions see the results of one of these assessments as the endpoint of the process. The next section considers the questions and issues that determine whether such assessment efforts provide useful information for planning, evaluation, and decision-making processes that promote accountability and improvement.


Using Assessment Results Effectively

How well do assessments reflect student experiences on our campus?

Before acting upon the results of any assessment, it is important to understand how well the results reflect actual student experiences. Answering this question requires institutions to examine several technical issues, including the representativeness of the sample (what is the response rate and response bias?), the reliability of the instrument (would it yield the same results if administered to a different but equally representative group?), and the validity of the instrument (does it actually measure what it purports to measure?). For several reasons, an assessment instrument created by a nationally recognized organization likely will be more reliable and valid than one developed locally. However, measurement in higher education and in the social sciences is by no means exact.

The assessment instruments that this guide describes have a respectable level of reliability, but reliability is the easiest measurement characteristic to achieve. Sampling representativeness is entirely related to how the instrument is administered, which in many cases the institution partly or entirely determines. Validity is the thorniest issue. It encompasses questions related to the simplicity or complexity of what is being measured (for example, where students live while attending college versus their engagement in the academic community), as well as how well students’ recollections reflect their actual experiences.

Validity also relates to how institutions interpret assessment results. It is unclear whether students’ responses to questions about college experiences reveal more about student or institutional differences. For example, business majors’ responses to questions about required general education courses may differ from those of liberal arts majors taking the same courses at the same institution. Therefore, the institutional differences reflected through these assessments may well reflect differences in student profiles rather than differences in the quality of institutional programs and services.

For this reason, it is important to consider how comparative institutional benchmarks are generated; the assessment instruments and services described in this guide vary greatly in this regard. Some offer only global national comparisons. Others allow survey administrators to select comparison groups according to general institutional characteristics, such as selectiveness or Carnegie classification type. Still others allow for more flexible choices, including designating a comparison group among a minimum set of institutions (e.g., at least eight) from among all participants.

Despite all the limitations presented by issues of reliability, sample representativeness, and validity, the results of these assessments still can be quite useful for internal improvements and external accountability. But campus administrators need to understand these limitations to make informed decisions. The next two sections offer some ways to maximize these uses.

How can we use the data for assessment and improvement?

Several voices need to come together to ensure that institutions can put the results of these assessments to good use. As mentioned previously, those who are in a position to impact the quality of relevant institutional programs and processes must be at the table when choosing assessment instruments. The results of the assessments must be shared with these same individuals. Given the technical issues raised in the previous section, it is equally important to involve individuals who understand the technical and contextual limitations of such assessments, such as institutional researchers or faculty with expertise in assessment or survey research.

The respondents also can help institutions use assessment results effectively. Discussions with relevant student, alumni, and faculty groups often can provide keen insights into how questions were interpreted and, therefore, what the results may mean. Respondents’ interpretations of results provide an additional perspective that helps the information user further understand the context and limitations of the results.

It is important to consider carefully where and at what level assessment results will be used before engaging in the effort. Given the size and variability of many college student bodies, data often must be disaggregated into meaningful subgroups based on characteristics such as class level or major before they can be valuable for program improvement. Survey administrators must draw samples in a way that allows generalization to the subgroups for which results are desired. Unfortunately, comparative institutional data are not always available for subgroups that may be meaningful to a particular campus.

Most of the national assessments introduced in this guide make provisions for including local items in the survey administration, thereby customizing the instrument for a particular campus or group of institutions. However, campus officials must consider some important limitations. The number of local items is often limited. Furthermore, adding to the length of any assessment instrument detracts from the response rate and response quality. Response formats for local items usually are limited to five-point Likert-type scales (for example, strongly disagree, disagree, neutral, agree, and strongly agree). The institution may be able to vary the response scale, but doing so often detracts from the reliability of the instrument.

How can campuses, governing boards, policy makers, or other constituents use the results for public accountability?

With all the inherent limitations of any particular assessment, college and university presidents must concern themselves with who receives the results of these assessments and how their institution packages and disseminates these results to external audiences. The data from most of the instruments described in this guide are considered the property of the institution. As such, the administrators, faculty, and staff of the institution control the way the information is packaged and presented.

Several of the instruments discussed in this guide were designed with public accountability as a primary purpose. One explicit goal of the NSSE is to impact national rankings of colleges and universities. Peterson’s College Results Survey, which students and alumni can complete outside the control of their institution, may become a centerpiece of Peterson’s consumer information services. An increasing number of state and university systems are using common assessment surveys to benchmark institutional effectiveness.

Because of this trend, the use and control of results from some of these national assessments for public accountability may become more complicated in the future. In the short run, these instruments provide a valuable source of campus-level accountability information for governing boards, public constituencies, and accreditation agencies. In the long run, the use of these and other types of assessments for internal planning and improvement may be the best support for external accountability. Colleges and universities that aggressively evaluate their programs and services—and act on that information to improve those programs and services—will gather a rich body of evidence to support their claims of institutional effectiveness.

What are the advantages and disadvantages of using national surveys, compared to local instruments?

National surveys are a relatively cost-effective way to gather assessment information. The organizations that develop these instruments often devote greater technical and financial resources than can an individual institution. They also test these instruments on a broader population than can a single institution. The availability of comparative data from other institutional participants is an important advantage to using a national assessment, but, as mentioned earlier, comparative benchmarks may be of limited use if the comparison group does not include institutions with student profiles similar to the target institution or if student profile differences otherwise are not taken into account. Moreover, comparative data often are not available at disaggregate levels, where results can be most potent.

Local assessment instruments provide much greater control and attention to local issues, and survey administrators can more closely integrate such instruments across populations and samples. A local assessment of decent quality typically costs more to administer than a national assessment but often yields results that are more directly applicable to the sponsoring campus. The lack of comparative data is a major limitation, but institutions working in data-sharing consortia can circumvent this limitation by introducing common items and methods among local instruments.

Developing quality local assessments requires a significant commitment of time and resources that some colleges and universities cannot afford; however, effective use of national assessments requires a similar significant commitment. The reports produced by the service providers are informative and have some direct use, but far more use comes from local follow-up analyses that address issues of immediate concern to individuals and groups working on improvement efforts.

Do we need to use the human subjects review process when administering one of these assessments?

The use of human subjects in assessment research is coming under increased scrutiny from two sources. In recent years, federal agencies that monitor the use of human subjects in research have tightened the enforcement of their regulations. The privacy of student and faculty records also is garnering increased attention among both state and federal agencies, as guided by the Family Educational Rights and Privacy Act (FERPA).

Technically, all research on human subjects requires review by a sanctioned human subjects review board. Institutional and program evaluation efforts are, in themselves, not considered research. Many college and university review boards define research according to the use of the results. Once faculty and staff prepare information for dissemination to an audience external to the institution, it becomes research; however, this does not always require that the initial data collection effort be reviewed. Faculty and staff who prepare articles or presentations to professional organizations that use data from institutional assessments should consult with their local review boards about the proper procedures to follow.

Regardless of whether assessment results are disseminated beyond the institution, survey administrators should treat data collected through national or local assessment instruments as confidential student records. Many colleges and universities develop specific policies regarding the use and storage of student and faculty data. In some cases, institutional policies need to be revisited when considering assessment data, since these data usually are not stored in the same information systems environment as operational student and faculty data.

How do these national assessments compare with other ways to assess institutional quality?

Assessments of institutional quality can take a variety of forms, including classroom-based activities, student performance assessments, program reviews, focus groups, exiting senior interviews, and so forth. The national surveys, standardized tests, and other assessment services considered in this guide can be an important part of these assessments. Colleges and universities that have well-developed assessment programs typically use a variety of assessments, including but not limited to national instruments. No single method offers the best approach. Each has benefits and limitations, which is why a comprehensive assessment program usually includes multiple, convergent methods.

For institutions that do not have well-established assessment programs, the national surveys described in this guide can be catalysts for further development. Specific findings can lead to action. Institutions can use perceived limitations as points of departure to identify the focus of future assessments.

Do these assessments encompass the views of all major student and other constituent groups?

Most of the instruments reviewed in this guide draw information about the college student experience directly from students: new, continuing, and graduated. A few instruments gather input on institutional quality from faculty, administrators, and staff. One voice that is completely absent from these assessments is that of the individuals and organizations who interact with and hire college and university graduates: employers, graduate school administrators, social agencies, and so forth. Several specialized accrediting agencies require college program administrators to survey employers, but there is, as of yet, no popular national instrument.

Many of the popular surveys of the college student experience (for example, the CIRP Freshman Survey, CSEQ, and NSSE) originate from research on traditional-aged students attending residential universities. Some instruments are tailored for two-year colleges, such as the CCSEQ and Faces of the Future. Among the 15 instruments included in ACT’s Evaluation and Survey Services program is an assessment designed to evaluate the experiences of adult learners (for example, nontraditional-aged and commuter students). Noel-Levitz’s new Adult Student Priorities Survey (ASPS) also is tailored toward this group. However, the research base still is somewhat skewed toward the definitions of quality that emerge from the traditional college experience.


Conclusion

The American system of higher education is characterized by institutional diversity, and the increasing variety of national assessments is only beginning to reflect this diversity. As colleges and universities respond to the increasing demand for public accountability and consumer information, we hope to see an attendant increase in the number of assessment instruments that better reflect the complete range of college and university missions and clientele. College and university presidents can contribute to this end by initiating discussions on their campuses and among similar types of institutions about the kinds of evidence that would best reflect the quality of their institutions. The most exemplary local and consortia efforts to assess quality in terms of institutional mission likely will exert the strongest influence on the future developments of national assessments.


TABLE 1. Instrument, Administrator, Purpose, Use of Data, History, and Information Collected

ENTERING UNDERGRADUATES

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS)
Administrator: Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Purpose: Collects demographic and attitudinal information on incoming students. Serves as baseline for longitudinal follow-up. Measures trends in higher education and characteristics of American college freshmen.
Use of data: Admissions and recruitment; academic program development and review; self-study and accreditation; public relations and development; institutional research and assessment; retention studies; longitudinal research about the impacts of policies and programs.
History: Established in 1966 at ACE, the CIRP was transferred to HERI at UCLA in 1973.
Information collected: Demographic characteristics; expectations of the college experience; secondary school experiences; degree goals and career plans; college finances; attitudes, values, and life goals; reasons for attending college.

Freshman Class Profile Service
Administrator: American College Testing (ACT)
Purpose: Summarizes the characteristics of ACT-tested enrolled and nonenrolled students by institution.
Use of data: Institutional marketing and recruitment: knowledge of competitors, characteristics of enrolled students, feeder high schools, etc.
History: Student Profile Section (SPS) is a set of 189 items included in the ACT Assessment Program. Certain items are updated every few years for currency (e.g., racial/ethnic categories). The current format of the SPS was developed in 1973.
Information collected: Demographics: background information; high school characteristics and evaluation; needs assessment; career interests; college plans; and achievement test scores.

Student Descriptive Questionnaire (SDQ)
Administrator: The College Board
Purpose: Provides a basic profile of students who took the SAT.
Use of data: Admissions and recruitment; institutional research and assessment; retention studies.
History: Not available.
Information collected: Prior academic record; high school course-taking patterns; student demographics; and family background.

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus)
Administrator: The College Board
Purpose: Studies students’ perceptions of their institution and its admissions process. Facilitates competitor and overlap comparisons.
Use of data: Recruitment; understanding of market position; evaluation of institutional image; calculation of overlap win/loss; evaluation of financial aid packaging.
History: ASQ was developed in 1987, ASQ Plus in 1991.
Information collected: Student assessment of programs, admissions procedures, literature, institutional image; financial aid packages; common acceptances; comparative evaluations. ASQ Plus provides specific institutional comparisons.

College Student Expectations Questionnaire (CSXQ)
Administrator: Center for Postsecondary Research and Planning (CPRP) at Indiana University
Purpose: Assesses new students’ expectations upon matriculation. Findings can be compared with student reports of their actual experiences as measured by the College Student Experiences Questionnaire (CSEQ).
Use of data: Comparison with CSEQ data to identify areas where the first-year experience can be improved. Also can be used for campus research and assessment initiatives.
History: Originally developed in 1997 for a FIPSE-funded project, CSXQ is an abbreviated version of CSEQ. Second edition available since 1998.
Information collected: Background information; expectations for involvement in college activities; predicted satisfaction with college; and expected nature of college learning environments.

ENROLLED UNDERGRADUATES

College Student Survey (CSS)
Administrator: HERI
Purpose: Evaluates students’ experiences and satisfaction to assess how students have changed since entering college. Can be used longitudinally with the CIRP Freshman Survey.
Use of data: Student assessment activities; accreditation and self-study; campus planning; policy analysis; retention analysis; and study of other campus issues.
History: The CSS was initiated in 1993 to permit individual campuses to survey undergraduates at any level and to conduct follow-up studies of their CIRP Freshman Survey respondents.
Information collected: Satisfaction with college experience; student involvement; cognitive and affective development; student values, attitudes, and goals; degree aspirations and career plans; Internet, e-mail, and other computer uses.

TABLE 1 (continued). Instrument, Administrator, Purpose, Use of Data, History, and Information Collected

ENROLLED UNDERGRADUATES

College Student Experiences Questionnaire (CSEQ)
Administrator: CPRP
Purpose: Measures quality of students’ experiences inside and outside the classroom, perceptions of environment, satisfaction, and progress toward 25 desired learning and personal development outcomes.
Use of data: Outcomes of college; accreditation review; institutional research, evaluation, and assessment; student recruitment and retention; assessment of undergraduate education.
History: Developed by C. Robert Pace in the 1970s, CSEQ is in its fourth edition (second edition, 1983; third edition, 1990). Since 1994 George Kuh (Indiana University) has directed the research program.
Information collected: Background information; level of student engagement in learning activities; student ratings of college learning environment; estimate of student gains toward learning goals; index of student satisfaction with the college.

Community College Student Experiences Questionnaire (CCSEQ)
Administrator: University of Memphis, Center for the Study of Higher Education
Purpose: Measures students’ progress and experiences.
Use of data: Self-study and accreditation review; assessment of institutional effectiveness; evaluation of general education, transfer, and vocational programs; use of technology; measurement of student interest, impressions, and satisfaction.
History: Co-authored by Jack Friedlander, C. Robert Pace, Patricia H. Murrell, and Penny Lehman (1991, revised 1999).
Information collected: Amount, breadth, and quality of effort expended in both in-class and out-of-class experiences; progress toward educational outcomes; satisfaction with community college environment; demographic and background characteristics.

National Survey of Student Engagement (NSSE)
Administrator: CPRP
Purpose: Gathers outcomes assessment, undergraduate quality, and accountability data. Measures students’ engagement in effective educational practices (level of challenge, active learning, student-faculty interaction, supportive environment, etc.).
Use of data: Institutional improvement and benchmarking; monitoring of progress over time; self-studies and accreditation; and other private and public accountability efforts.
History: Designed in 1998 by a group of assessment experts chaired by Peter Ewell, NCHEMS. Project director is George Kuh, Indiana University.
Information collected: Student reports of quality of effort inside and outside the classroom, including time devoted to various activities and amount of reading and writing, higher order thinking skills, quality of interactions, educational and personal gains, and satisfaction.

Your First College Year (YFCY)
Administrator: HERI and Policy Center on the First Year of College at Brevard College
Purpose: Designed as a follow-up survey to the CIRP Freshman Survey. Assesses student development during the first year of college.
Use of data: Admissions and recruitment; academic program development and review; self-study and accreditation; public relations and development; institutional research and assessment; retention studies; longitudinal research; first-year curriculum efforts.
History: Administered by HERI in partnership with the Policy Center on the First Year of College, and funded by The Pew Charitable Trusts, YFCY was pilot-tested in spring 2000. Second pilot is scheduled for spring 2001. Full administration will begin in 2002.
Information collected: One-third of items are CIRP post-test items. Remaining questions address students’ academic, residential, and employment experiences; self-concept and life goals; patterns of peer and faculty interaction; adjustment and persistence; degree aspirations; and satisfaction.

Student Satisfaction Inventory (SSI)
Administrator: Noel-Levitz
Purpose: Measures students’ satisfaction.
Use of data: Student retention; student recruitment; strategic planning and institutional effectiveness.
History: SSI was piloted in 1993 and became available to institutions in 1994.
Information collected: Ratings on importance of and satisfaction with various aspects of campus. The survey covers most aspects of student experience.

Faces of the Future
Administrator: American Association of Community Colleges (AACC) and ACT
Purpose: Assesses the current state of the community college population and explores the role community colleges play in students’ lives.
Use of data: Community college students; benchmarking; comparisons to national data; tracking of trends in student population.
History: Developed in 1998 and piloted in early 1999, the survey is now in its third year of administration.
Information collected: Background information (general, employment, education); current college experiences (access and purpose, learning and satisfaction, expected outcome and intent, transitions).

Adult Student Priorities Survey (ASPS)
Administrator: Noel-Levitz
Purpose: Measures satisfaction of students age 25 and older.
Use of data: Student retention; student recruitment; strategic planning and institutional effectiveness.
History: ASPS was piloted and became available to institutions in 2000.
Information collected: Ratings on importance of and satisfaction with various aspects of campus. The survey is specific to the experience of adult students.

TABLE 1 (continued). Instrument, Administrator, Purpose, Use of Data, History, and Information Collected

STUDENT PROFICIENCIES AND LEARNING OUTCOMES

Academic Profile
Administrator: Educational Testing Service (ETS) and The College Board
Purpose: Assesses college-level general education skills.
Use of data: Describe performance of individuals and groups; measure growth in learning; use data as a guidance tool and performance standard.
History: Introduced in 1992 to assist institutions with accreditation, accountability, and program improvement.
Information collected: Norm-referenced and criterion-referenced scores measure college-level reading, college-level writing, critical thinking, and mathematics.

Tasks in Critical Thinking
Administrator: ETS
Purpose: Assesses proficiency in college-level, higher order thinking skills.
Use of data: Each student receives a confidential report on skills performance. Data can help institution learn more about teaching and program effectiveness.
History: Introduced in 1992 to address the need to assess college-level higher order thinking skills and to improve teaching and learning.
Information collected: Measures college-level inquiry, analysis, and communication skills.

Major Field Tests
Administrator: ETS
Purpose: Assesses students’ academic achievement in major field of study.
Use of data: Measure student academic achievement and growth and assess effectiveness of departmental curricula for planning and development.
History: These tests originally were based on the GRE subject tests and are jointly sponsored by ETS and the GRE Board. Each test is periodically updated to maintain currency with standard undergraduate curricula.
Information collected: Factual knowledge; ability to analyze and solve problems; ability to understand relationships; and ability to interpret material. Available for 15 disciplines; see www.ets.org/hea for listing.

Area Concentration Achievement Tests (ACAT)
Administrator: Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Purpose: Assesses outcomes and provides curriculum-specific feedback on student achievement.
Use of data: Provide specific program analysis.
History: Established in 1983 and expanded in 1988 by a FIPSE grant, ACAT is a nationally normed instrument with items written by faculty in the various disciplines.
Information collected: Discipline-specific surveys cover agriculture, biology, criminal justice, geology, history, neuroscience, political science, psychology, art, English literature, and social work.

Collegiate Assessment of Academic Proficiency (CAAP)
Administrator: ACT
Purpose: Assesses college students’ academic achievement in general education skills.
Use of data: Document levels of proficiency; compare local populations via user norms; establish eligibility requirements; report educational outcomes for accountability and accreditation; improve teaching; and enhance student learning.
History: CAAP was introduced in 1988.
Information collected: Assessment of proficiency in core general education skills, including writing (objective and essay), reading, math, science reasoning, and critical thinking.

ALUMNI

Comprehensive Alumni Assessment Survey (CAAS)
Administrator: National Center for Higher Education Management Systems (NCHEMS)
Purpose: Measures evidence of institutional effectiveness and reports on alumni personal development and career preparation.
Use of data: Help clarify institutional mission and goals and assist in developing new goals. Evaluate mission attainment and impact of general education programs, core requirements, and academic support services.
History: Not available.
Information collected: Employment and continuing education; undergraduate experience; development of intellect; achievement of community goals; personal development and enrichment; community participation; demographic and background information.

College Results Survey (CRS)
Administrator: Peterson’s, a Thomson Learning Company
Purpose: Identifies personal values, abilities, occupations, work skills, and participation in lifelong learning of college graduates. Uses alumni responses to establish a unique institutional profile.
Use of data: Peterson’s uses data collected online for consumer information at http://www.bestcollegepicks.com. Institutions use data collected in collaboration with Peterson’s for self-study.
History: Formerly the College Results Instrument, CRS was developed by Robert Zemsky at the University of Pennsylvania with support from the U.S. Department of Education and the Knight Higher Education Collaborative.
Information collected: Lifelong learning; personal values; confidence; occupation and income; and work skills.


SERIES OF INSTRUMENTS

Evaluation/Survey Services / ACT
Purpose: Assesses needs, development, attitudes, and opinions of students and alumni.
Use of data: Accreditation; program and service assessment; outcomes assessment; retention; alumni follow-up; institutional self-study.
History: Established in 1979.
Information collected: Fifteen standardized instruments include alumni surveys, outcomes assessment surveys, satisfaction surveys, opinion surveys, entering student surveys, and nonreturning student surveys. See www.act.org/ess/index.html for a complete list of available surveys.

Student Outcomes Information Survey (SOIS) / NCHEMS
Purpose: Collects information about students' needs and reactions to their educational experiences.
Use of data: Longitudinal assessment of students' experiences and opinions.
History: In use since 1978.
Information collected: Background, personal goals, and career aspirations; factors influencing college choice; satisfaction with college experience; activities while in college; educational plans and accomplishments; career choices; career successes.

FACULTY AND INSTITUTIONAL SURVEYS

Faculty Survey / HERI
Purpose: Collects information about the workload, teaching practices, job satisfaction, and professional activities of collegiate faculty and administrators.
Use of data: Accreditation and self-study reports; campus planning and policy analysis; faculty development programs; benchmarking faculty characteristics.
History: In seven faculty surveys conducted since 1969, HERI has collected data on more than 500,000 college faculty at more than 1,000 institutions. The next faculty survey is scheduled for 2001–02.
Information collected: Background characteristics; teaching practices and research activities; interactions with students and colleagues; professional activities; faculty attitudes and values; perceptions of the institutional climate; job satisfaction.

Institutional Performance Survey (IPS) / NCHEMS
Purpose: Assesses institutional performance and effectiveness.
Use of data: Self-study; marketing.
History: IPS is a by-product of a national research study to assess how various institutional conditions are related to the external environment, strategic competence, and effectiveness.
Information collected: More than 100 items measure eight dimensions of institutional performance.

Institutional Priorities Survey (IPS) / Noel-Levitz
Purpose: Assesses faculty, staff, and administrative perceptions and priorities (recommended with the SSI to determine where priorities overlap with those of students).
Use of data: Student retention; student recruitment; strategic planning and institutional effectiveness. Institutions can pinpoint areas of consensus on campus.
History: IPS was developed as a parallel instrument to the Noel-Levitz SSI. IPS was piloted and made available in 1997.
Information collected: Perceptions of the importance of meeting various student expectations, and level of agreement that the institution actually is meeting these expectations.

Program Self-Assessment Service (PSAS) and Graduate Program Self-Assessment Service (GPSAS) / ETS
Purpose: Assesses students' opinions on undergraduate and graduate programs.
Use of data: Used by departments for self-study and as additional indicators of program quality for accreditation purposes.
History: GPSAS was developed in conjunction with the Council of Graduate Schools in the 1970s. PSAS was developed in the 1980s using GPSAS as a model.
Information collected: Quality of teaching; scholarly excellence; faculty concern for students; curriculum; students' satisfaction with programs; resource accessibility; employment assistance; faculty involvement; departmental procedures; learning environment.


TABLE 2. Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

ENTERING UNDERGRADUATES

Freshman Class Profile Service / American College Testing (ACT)
Target institutions/samples: All types/All ACT test-takers.
Participation: Over 1 million high school students are tested each year. This service includes more than 550,000 enrolled students from 900 institutions each year.
Format: Responses are collected via paper as part of the registration materials for the ACT Assessment. They are later electronically combined with assessment results for reporting and research purposes.
Administration procedure: Students complete this when registering for the ACT. Responses and ACT scores are sent to schools and institutions.
Timeline: Institutions register in July of each year. Enrollment information is sent to ACT from September through June; reports are produced within 30 to 60 days.

Student Descriptive Questionnaire (SDQ) / The College Board
Target institutions/samples: All types/All SAT test-takers.
Participation: All students who participate in the SAT complete the SDQ. Responses only sent if student indicates "Yes" to being included in Student Search Service.
Format: This paper-and-pencil instrument is completed as part of the test registration process.
Administration procedure: Students complete the questionnaire prior to taking the SAT. Responses and SAT scores are sent to schools.
Timeline: Tapes and/or diskettes are sent to institutions six times per year as part of SAT Test reports.

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus) / The College Board
Target institutions/samples: All types/All admitted students.
Participation: Every year, 220 institutions participate and 400,000 students are surveyed.
Format: Each program has a matriculating and a nonmatriculating student version of a standardized paper survey. Optional web version also is available.
Administration procedure: Colleges administer and collect surveys, then send them to The College Board for processing. ASQ Plus asks colleges to identify their major competitors, and students rate their college choice vs. other top choices.
Timeline: Institutions determine when to mail surveys, but The College Board recommends that they do so as soon as they know who will enroll (usually mid- to late May). Follow-up strongly recommended.

College Student Expectations Questionnaire (CSXQ) / Center for Postsecondary Research and Planning (CPRP) at Indiana University
Target institutions/samples: Four-year public and private institutions/Incoming students.
Participation: More than 33,000 students at two dozen different types of colleges and universities participate.
Format: Four-page paper survey; web version under development. Takes 10 minutes to complete. Demo version at www.indiana.edu/~cseq.
Administration procedure: Institutions administer surveys and return completed instruments to CPRP for processing. The web version is administered via a server at Indiana University; institutions provide student contact information.
Timeline: Most institutions administer the survey during fall orientation. To compare student expectations with actual experiences, colleges administer the CSEQ to the same students the following spring.

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS) / Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Target institutions/samples: All types/Incoming students (ESS specifically designed for two-year institutions).
Participation: Since 1966, 1,700 institutions and 10 million students have participated. In fall 2000, 717 institutions and 404,000 students participated.
Format: Four-page paper survey.
Administration procedure: Colleges order surveys from HERI, administer surveys on campus, and return completed surveys for processing. Most campuses administer the survey in proctored groups.
Timeline: Register for the survey in the spring. Surveys are administered in the summer and fall, usually during orientation. Report available in December.

ENROLLED UNDERGRADUATES

College Student Survey (CSS) / HERI
Target institutions/samples: All types/All students.
Participation: CSS has collected data from more than 230,000 students at 750 institutions.
Format: Four-page paper survey.
Administration procedure: Campuses administer surveys and return them to the data processing center. Campuses may choose to survey students who completed the CIRP for the purposes of longitudinal study.
Timeline: Register January 1 or May 1. Two administration periods available: January through June and July through December. Reports from the first period available in fall, from the second period in February of the subsequent year.


TABLE 2 (continued). Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

ENROLLED UNDERGRADUATES

Community College Student Experiences Questionnaire (CCSEQ) / University of Memphis, Center for the Study of Higher Education
Target institutions/samples: Community colleges/All students.
Participation: The 1991 edition collected data from 45,823 students at 57 institutions. The 1999 edition collected data from 18,483 students at 40 institutions.
Format: Paper survey, self-report (Likert scale).
Administration procedure: Instruments can be mailed to students or distributed in classes, through student organizations, or other student assemblies. Completion of the instrument takes 20 to 30 minutes.
Timeline: The Center provides surveys upon receipt of an order. Scoring is completed and results are mailed two to three weeks after colleges return instruments to the Center.

Faces of the Future / American Association of Community Colleges (AACC) and ACT
Target institutions/samples: Community colleges/All students (credit and noncredit).
Participation: In fall 1999, more than 100,000 students at 250 institutions participated in the survey.
Format: Paper survey.
Administration procedure: Colleges order materials from ACT and administer surveys on campus. Completed surveys are returned to ACT for scoring and processing.
Timeline: Surveys can be administered during the AACC/ACT fall administration (October) for a reduced cost, or can be administered at other times at regular cost.

National Survey of Student Engagement (NSSE) / CPRP
Target institutions/samples: Four-year public and private institutions/First-year and senior students.
Participation: After the 1999 field test, the first national administration was in spring 2000 with 195,000 students at 276 institutions. CPRP annually surveys approximately 200,000 students at 275 to 325 colleges and universities.
Format: Students can complete either a four-page paper survey or the identical online version. Students at one-fifth of participating schools complete the web survey. Demo version at www.indiana.edu/~nsse.
Administration procedure: Schools send student data files, letterhead, and invitation letters to CPRP, which handles data collection, including random sampling, sending surveys to students, and conducting follow-ups. Students return surveys to CPRP.
Timeline: Institutions send data files to CPRP in late fall. Surveys are mailed to students in late winter and early spring. Follow-ups continue through the spring. CPRP sends institutional reports and data to schools in late summer.

Your First College Year (YFCY) / HERI and Policy Center on the First Year of College at Brevard College
Target institutions/samples: All types/Students near the end of the first year of college.
Participation: A total of 58 institutions and 19,000 first-year students will participate in the spring 2001 pilot. Participation expected to be open to all institutions in spring 2002.
Format: Four-page paper survey; web survey also available.
Administration procedure: HERI oversees administration of the paper or web-based survey instrument; students return completed survey forms to the data processing center.
Timeline: Institutions register for the survey in the fall and administer the survey in the spring. Reports are available in late summer.

Student Satisfaction Inventory (SSI) / Noel-Levitz
Target institutions/samples: All types (four-year, two-year, and career school versions are available)/All students.
Participation: SSI is used by more than 1,200 colleges and universities. More than 800,000 student records are in the national database.
Format: Paper survey. In spring 2001, the survey also will be available on the web.
Administration procedure: SSI is generally administered in a classroom setting and takes 25 to 30 minutes. The web version takes 15 to 20 minutes. The URL is e-mailed to students along with a specific student numeric password to enter the survey area.
Timeline: Students can complete the survey anytime during the academic year. Surveys generally arrive on campus within one week of ordering. Institutions send completed surveys to Noel-Levitz for processing. Reports are ready for shipment in 12 to 15 business days.

College Student Experiences Questionnaire (CSEQ) / CPRP
Target institutions/samples: Four-year public and private institutions/All students.
Participation: More than 500 colleges and universities and approximately 250,000 students have participated since 1983 (when the second edition was published).
Format: Eight-page paper survey; identical web survey also available. Takes 20 to 25 minutes to complete. Demo version at www.indiana.edu/~cseq.
Administration procedure: Institutions administer surveys and return completed instruments to CPRP for processing. The web version is administered via a server at Indiana University; institutions provide student contact information.
Timeline: Most institutions administer the survey at the mid-point or later in the spring term so that students have enough experience on campus to provide valid, reliable judgments. For research purposes, the CSEQ also can be administered at other times.

Adult Student Priorities Survey (ASPS) / Noel-Levitz
Target institutions/samples: All types/All students 25 years and older.
Participation: ASPS was piloted by more than 30 institutions and more than 4,000 students in spring 2000.
Format: Paper survey. In spring 2001, the survey will be available on the web.
Administration procedure: ASPS is administered in a classroom setting and takes 25 to 30 minutes. Web completion takes students 15 to 20 minutes. For the web version, the URL and password are e-mailed to students.
Timeline: Students can complete the survey anytime during the academic year. Surveys generally arrive on campus within one week of ordering. Institutions send completed surveys to Noel-Levitz for processing. Reports are ready for shipment in 12 to 15 business days.


TABLE 2 (continued). Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

STUDENT PROFICIENCIES AND LEARNING OUTCOMES

Academic Profile / Educational Testing Service (ETS) and The College Board
Target institutions/samples: All types/All students.
Participation: This survey has been used by 375 institutions and 1 million students.
Format: Paper survey (long and short forms). The long form contains 108 multiple-choice questions and takes 100 minutes. The short form contains 36 questions. An optional essay is available.
Administration procedure: Colleges order materials from ETS and administer them to students. Colleges return tests to ETS for scoring.
Timeline: Institutions administer tests on their own timeline. Tests are scored weekly, and reports are issued approximately three weeks after ETS receives tests.

Tasks in Critical Thinking / ETS
Target institutions/samples: All types/All students.
Participation: This instrument is administered by 35 institutions to 200 to 500 students at each institution.
Format: Open-ended or performance-based 90-minute "tasks" in humanities, social sciences, or natural sciences. The score range for each skill is 1 to 6, with 4 as the core score.
Administration procedure: Colleges order materials from ETS and administer them to students. ETS trains faculty to score students' responses, or ETS scores the tasks. There are nine separate tasks; three tasks can be used for assessing fewer than 100 students.
Timeline: Colleges decide who and when to test. Faculty decide the scoring schedule, or ETS provides a three- to four-week turnaround for issuing a report.

Major Field Tests / ETS
Target institutions/samples: Four-year colleges and universities/Senior students.
Participation: In the 1999–2000 academic year, more than 1,000 departments from 606 higher education institutions administered nearly 70,000 tests. Current national comparative data include accumulated scores from 96,802 seniors.
Format: Paper-and-pencil test.
Administration procedure: Institutions order tests, administer them onsite to students, and return them to ETS for processing.
Timeline: Must order three to four weeks prior to administration for standard shipping. Answer sheets received by the beginning of each month are scored that month (no scoring in January or September). Reports are mailed three weeks after scoring.

Area Concentration Achievement Tests (ACAT) / Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Target institutions/samples: Two- and four-year public and private institutions/Generally seniors, although ACAT can serve as a pre-test.
Participation: Approximately 300 institutions and more than 50,000 students have participated.
Format: Paper survey. Multiple-choice test requiring 48 to 120 minutes, depending on content.
Administration procedure: Institutions order surveys, administer them to students, and return them to PACAT for scoring and analysis.
Timeline: Must order surveys at least 15 days prior to the administration date. PACAT scores surveys during the last full working week of the month and mails reports the first working week of the month.

Collegiate Assessment of Academic Proficiency (CAAP) / ACT
Target institutions/samples: All types/All students.
Participation: More than 600 institutions have used CAAP since 1988. More than 450,000 students have tested between 1998 and 2000.
Format: Demographic questions collected on paper with the assessment battery. Users may add up to nine additional items; they also may design their own assessment test battery by choosing from the six different skill modules.
Administration procedure: Colleges order the assessment battery from ACT, administer it during a locally determined two-week test period, and return it to ACT for processing.
Timeline: Flexible administration schedule. Each assessment module can be administered within a 50-minute class period. Institutions must order assessments at least two weeks prior to the administration period.

ALUMNI

Comprehensive Alumni Assessment Survey (CAAS) / NCHEMS
Target institutions/samples: All types (two-year and four-year versions available)/Alumni.
Participation: Information not available.
Format: Paper survey.
Administration procedure: Colleges order surveys from NCHEMS, administer surveys, and return them to NCHEMS for scoring.
Timeline: NCHEMS mails results three weeks from the date surveys are returned for scoring.

College Results Survey (CRS) / Peterson's, a Thomson Learning Company
Target institutions/samples: Bachelor degree-granting institutions/Alumni, preferably four to 10 years following degree attainment. Recommended sample size is 2,000.
Participation: The pilot study included 80 institutions and 40,000 instruments. The web-based survey is open to any graduate. There is no limit on the number of participants.
Format: Web-based survey comprised of four sections. Takes 15 to 20 minutes to complete.
Administration procedure: Alumni visit a web site to complete the survey. Models for working with individual institutions are under development. Institutions identify alumni cohorts, whom Peterson's then contacts and directs to the online instrument.
Timeline: Unlimited online availability or as arranged.


TABLE 2 (continued). Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

SERIES OF INSTRUMENTS

Evaluation/Survey Services / ACT
Target institutions/samples: All types/New students, enrolled students, non-returning students, and alumni.
Participation: Since 1979, 1,000 institutions have administered more than 6 million standardized surveys nationwide.
Format: Most surveys are four-page paper documents; one is two pages in length.
Administration procedure: Administration procedures are established at the discretion of the institution.
Timeline: Institutions mail completed surveys to ACT for processing. Scanning occurs every second and fourth Friday; ACT produces and mails reports three to four weeks after scanning.

Student Outcomes Information Survey (SOIS) / NCHEMS
Target institutions/samples: All types (two- and four-year versions available)/Questionnaires for entering students, continuing students, former students, graduating students, recent alumni, and long-term alumni.
Participation: Information not available.
Format: Paper survey.
Administration procedure: Colleges order surveys from NCHEMS, administer surveys, and return them to NCHEMS for scoring.
Timeline: NCHEMS mails results two weeks from the date surveys are returned for scoring.

FACULTY AND INSTITUTIONAL SURVEYS

Faculty Survey / HERI
Target institutions/samples: All types/Full-time undergraduate faculty and academic administrators.
Participation: In 1998–99, data were collected from more than 55,000 faculty at 429 colleges and universities.
Format: Four-page paper survey.
Administration procedure: Faculty surveys are sent to campuses in the fall. Campuses are responsible for survey distribution. HERI provides outgoing envelopes and pre-addressed, postage-paid return envelopes that respondents mail directly to HERI's survey processing center.
Timeline: Institutions register in the spring and summer. HERI administers surveys in the fall and winter. HERI issues campus profile reports the following spring and summer.

Institutional Performance Survey (IPS) / NCHEMS
Target institutions/samples: All types (two-year and four-year versions available)/Faculty, administrators, and board members.
Participation: Information not available.
Format: Paper survey.
Administration procedure: Colleges order surveys and distribute them. Surveys include a postage-paid return envelope for respondents to return the survey directly to NCHEMS to maintain anonymity.
Timeline: NCHEMS returns results three weeks after an institutionally determined cut-off date.

Institutional Priorities Survey (IPS) / Noel-Levitz
Target institutions/samples: All types (two-year and four-year versions available)/Faculty, administrators, and staff.
Participation: More than 400 institutions have used the IPS.
Format: Paper survey. In spring 2001, the survey also will be available on the web.
Administration procedure: The paper survey takes about 30 minutes and can be distributed via various methods on campus, including campus mail, face-to-face distribution, and staff meetings. The web version takes about 20 minutes. The URL and password can be e-mailed to staff.
Timeline: Institutions can administer the IPS anytime during the academic year. Surveys generally arrive on campus within a week of ordering. Institutions return completed surveys to Noel-Levitz for processing. Reports are ready for shipment within 12 to 15 business days.

Program Self-Assessment Service (PSAS) and Graduate Program Self-Assessment Service (GPSAS) / ETS
Target institutions/samples: College and university programs/Students, faculty, and alumni (separate questionnaires for each group). GPSAS has separate questionnaires for master's and Ph.D. programs.
Participation: In 1999–2000, 65 institutions and 12,000 students, faculty members, and alumni participated.
Format: Paper survey.
Administration procedure: Institutions purchase and administer the questionnaires and send completed questionnaires back to ETS for reporting.
Timeline: Processing begins the first working day of each month. ETS ships reports about three weeks after the start of processing.


TABLE 3. Reporting, Data Availability, Local Items, Costs, and Contact Information

ENTERING UNDERGRADUATES

Freshman Class Profile Service / American College Testing (ACT)
Report information: Paper report containing an executive summary; college attractions; academic achievement, goals and aspirations; plans and special needs; high school information; competing institutions; and year-to-year trends. A range of free and for-fee reports available.
National data available? Yes; national user data and college student profiles available.
Local items and consortia options: By providing additional data, campuses can use this service to summarize variables at all stages of the enrollment funnel: students who submitted their ACT scores, those who applied, those who were admitted, and those who enrolled.
Cost: There is no cost for the basic information.
Contact: Freshman Class Profile Service Coordinator. Phone: 319-337-1113. www.act.org/research/services/freshman/index.html

Student Descriptive Questionnaire (SDQ) / The College Board
Report information: Institutions receive SDQ responses for students who indicate "yes" to the Student Search Service on the registration form.
National data available? Yes; national and state-level benchmark reports available on paper and on The College Board web site.
Local items and consortia options: None.
Cost: No cost.
Contact: Educational Testing Service. Phone: 609-771-7600. E-mail through: www.collegeboard.org/html/communications000.html#SAT. Information about data tapes: www.collegeboard.org/sat/html/admissions/serve013.html

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus) / The College Board
Report information: Highlight report (executive summary), detailed report with all data, competitor report for ASQ Plus only, norms report with national data. Data file also available.
National data available? Yes; included in standard report.
Local items and consortia options: Standard overlap with all common acceptances in both surveys; specific overlap analysis includes five competitor schools in ASQ Plus. Both surveys can be customized by specifying characteristics of interest to the school. Limited local questions are available.
Cost: ASQ $600; ASQ Plus $925. Questionnaire printing fee: ASQ $.55 per form; ASQ Plus $.60 per form. Processing fee: ASQ $2.00 per form returned; ASQ Plus $2.25 per form returned.
Contact: Phone: 800-927-4302. E-mail: [email protected]. www.collegeboard.org/aes/asq/html/index000.htm

College Student Expectations Questionnaire (CSXQ) / Center for Postsecondary Research and Planning (CPRP) at Indiana University
Report information: Computer diskette containing raw institutional data file and output file with descriptive statistics. Schools also receive a hard copy of the output file. Additional analyses available for a fee.
National data available? No; tentative norms are under development and will be available summer 2001. Norms reports will include relevant comparison group data by Carnegie type.
Local items and consortia options: Local additional questions and consortia analyses are available.
Cost: For the regular paper survey administered by the institution, the cost is $125 plus $.75 per survey and a $1.50 scoring fee per completed questionnaire.
Contact: College Student Expectations Questionnaire, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5825. Fax: 812-856-5150. E-mail: [email protected]. www.indiana.edu/~cseq

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS) / Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Report information: Paper report with local results, and aggregate results for similar institutions derived from the national norms. Separate profiles for transfer and part-time students. Special reports and data file available for a fee.
National data available? Yes; national results included in standard report.
Local items and consortia options: Contains up to 21 additional local questions. Consortia analyses available for a fee.
Cost: Participation fee of $400 plus $1 per returned survey for processing.
Contact: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall, Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: [email protected]. www.gseis.ucla.edu/heri/cirp.htm


TABLE 3 (continued). Reporting, Data Availability, Local Items, Costs, and Contact Information

ENROLLED UNDERGRADUATES

College Student Experiences Questionnaire (CSEQ) / CPRP
Report information: Computer diskette containing raw institutional data file and output file with descriptive statistics. Schools also receive a hard copy of the output file. Additional analyses available for a fee.
National data available? No; an annual national report is not planned; however, norms reports are regularly updated and institutional reports include relevant aggregated comparison group data by Carnegie type.
Local items and consortia options: Local questions are available for a $250 charge. Consortia analyses are available.
Cost: For regular paper administration: $125 institutional registration fee plus $.75 per survey ordered and a $1.50 scoring fee per completed questionnaire. Web administration cost is a $495 institutional registration fee plus $2.25 per completed survey.
Contact: College Student Experiences Questionnaire, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5825. Fax: 812-856-5150. E-mail: [email protected]. www.indiana.edu/~cseq

Community College Student Experiences Questionnaire (CCSEQ) / University of Memphis, Center for the Study of Higher Education
Report information: Diskette containing all responses and scores for students and a summary computer report are available for a fee of $75.
National data available? Yes; national data can be found in the CCSEQ manual, which is available for $12.
Local items and consortia options: Up to 20 local questions are available. CCSEQ can be used in statewide assessment efforts to provide data for strategic planning and staff development.
Cost: $.75 per survey purchased and $1.50 per survey for scoring; $75 for print report and data on diskette.
Contact: Center for the Study of Higher Education, 308 Browning Hall, The University of Memphis, Memphis, TN 38152. Phone: 901-678-2775. Fax: 901-678-4291. E-mail: [email protected]. www.people.memphis.edu/~coe_cshe/CCSEQ_main.htm

Faces of the Future / American Association of Community Colleges (AACC) and ACT
Report information: Participating schools receive national results, an individualized report with information about their student population, a report comparing their data to the national data, and a data file.
National data available? Yes.
Local items and consortia options: Colleges may add up to 10 local items. Statewide administration is available.
Cost: AACC/ACT administration: $.75 per survey (includes scoring) plus $50 processing and reporting fee. Standard administration: $13.65 per 25 surveys plus $.80 each for scanning, $50 processing fee, and $50 reporting fee.
Contact: Kent Phillippe, Senior Research Associate, AACC. Phone: 202-728-0200, ext. 222. E-mail: [email protected]. www.aacc.nche.edu/initiatives/faces/f_index.htm

College Student Survey (CSS) / HERI
Report information: The Campus Profile Report includes the results of all respondents. The Follow-up Report contains matched CIRP and CSS results for easy comparison. Special reports and data files available for a fee.
National data available? Yes; national aggregates for similar institutions. Complete national aggregates available from HERI.
Local items and consortia options: Local questions are available. Consortia analyses are available for a fee.
Cost: $450 participation fee plus $1 for each survey returned for processing.
Contact: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall, Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: [email protected]. www.gseis.ucla.edu/heri/cirp.htm

National Survey of Student Engagement (NSSE) / CPRP
Report information: Comprehensive institutional profile, aggregated comparison data for similar schools, and national benchmark report. Includes data file, means and frequency distributions on all items, and significance tests. Special analyses available for a fee.
National data available? Yes; aggregated comparative information included in standard institutional report and annual national report.
Local items and consortia options: Schools or state systems (i.e., urban, research, selective privates) may form a consortium of at least eight institutions and can ask up to 20 additional consortium-specific questions.
Cost: $275 participation fee plus per-student sampling fee based on undergraduate enrollment. Total cost ranges from approximately $2,500 to $5,500. Targeted over-sampling is available for an additional per-student fee.
Contact: National Survey of Student Engagement, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5824. Fax: 812-856-5150. E-mail: [email protected]. www.indiana.edu/~nsse


TABLE 3 (continued). Reporting, Data Availability, Local Items, Costs, and Contact Information

ENROLLED UNDERGRADUATES

Student Satisfaction Inventory (SSI) / Noel-Levitz
Report information: The standard campus report includes the mean data for all students alongside national averages. Optional reports and raw data are available for an additional fee.
National data available? Yes; four national comparison groups are standard, are available based on institution type, and are updated twice a year.
Local items and consortia options: Special comparison group reports are available for a fee.
Cost: $50 processing and setup fee plus $1.50 to $2.95 per survey, depending on the quantity ordered.
Contact: Julie Bryant, Program Consultant: [email protected], or Lisa Logan, Program Consultant: [email protected]. Phone: 800-876-1117. www.noellevitz.com

Adult Student Priorities Survey (ASPS) / Noel-Levitz
Report information: The standard campus report includes the mean data for all students alongside national averages. Optional reports and raw data are available for an additional fee.
National data available? Yes; the national comparison group includes data from four-year and two-year institutions and is updated twice a year. As of May 2000, the national group included 4,063 students from 32 institutions.
Local items and consortia options: Special comparison group reports are available for a fee.
Cost: $50 processing and setup fee plus $1.50 to $1.95 per survey, depending on the quantity ordered.
Contact: Julie Bryant, Program Consultant: [email protected], or Lisa Logan, Program Consultant: [email protected]. Phone: 800-876-1117. www.noellevitz.com

Your First College Year (YFCY) / HERI and Policy Center on the First Year of College at Brevard College
Report information: Paper report provides an in-depth profile of first-year students by sex, and comparative data for similar institutions. Data file also available.
National data available? Yes.
Local items and consortia options: Not available during pilot stages. Local items and consortia options will be available with the full-scale administration beginning in 2002.
Cost: No fees during pilot stages.
Contact: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall, Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: [email protected]. www.gseis.ucla.edu/heri/yfcy and www.brevard.edu/fyc/Survey/YFCYsurvey.htm

STUDENT PROFICIENCIES AND LEARNING OUTCOMES

Collegiate Assessment of Academic Proficiency (CAAP) / ACT
Report information: Institutional summary report and two copies of each student's score report. Certificate of achievement for students scoring at or above the national average on one or more modules. Supplemental reports and data file available for a fee.
National data available? Yes; for freshmen or sophomores at two- or four-year, public or private institutions.
Local items and consortia options: Nine optional local questions may be added at no additional charge.
Cost: $330 participation fee plus $8.95 to $16.55 per student, depending on the number of students and the number of modules purchased (includes instruments, scoring, and reporting).
Contact: ACT, Outcomes Assessment, P.O. Box 168, Iowa City, IA 52243-0168. Phone: 319-337-1053. Fax: 319-337-1790. E-mail: [email protected]. www.act.org/caap/index.html

Academic Profile / Educational Testing Service (ETS) and The College Board
Report information: Summary score report contains both criterion-referenced proficiency levels and norm-referenced scores. Scores vary slightly from long form to short form. Data diskette included in fee.
National data available? Yes; provided by class level and by Carnegie classification.
Local items and consortia options: Up to 50 local questions are available. Institutions can customize comparison groups from a list of participating schools (minimum of eight per group).
Cost: $300 annual institutional fee. Price varies by form and number purchased ($9 to $11.25 for the short form and $14.50 to $16.75 for the long form). $2.25 each for the optional essay (includes scoring guide). Minimum order of 50 tests.
Contact: Jan Lewis at 609-683-2271. Fax: 609-683-2270. E-mail: [email protected]. www.ets.org/hea/heaweb.html

Tasks in Critical Thinking / ETS
Report information: Scores are reported as the percentage of students demonstrating proficiency in each of the three skill areas (inquiry, analysis, and communication) as measured by the tasks.
National data available? No.
Local items and consortia options: None.
Cost: $16.50 each for the first 30 to 100.
Contact: Jan Lewis at 609-683-2271. Fax: 609-683-2270. E-mail: [email protected]. www.ets.org/hea/heaweb.html


TABLE 3 (continued). Reporting, Data Availability, Local Items, Costs, and Contact Information

STUDENT PROFICIENCIES AND LEARNING OUTCOMES

Area Concentration Achievement Tests (ACAT) / Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Report information: Schools receive two copies of the score report for each student. Standard scores compare students to a five-year national sample. Raw percentage scores of items correct also included. Additional analyses and data file available for a fee.
National data available? Yes.
Local items and consortia options: Schools can customize most tests to model the test after major requirements. Art and Literature in English cannot be customized. Social work customization will be available in June 2001.
Cost: Price ranges from $4 to $11 per student survey depending on discipline, pre-test vs. senior test, and two-year vs. four-year school. Price includes use of materials, scoring, two copies of the score report, and long-term maintenance of score histories.
Contact: PACAT, Box 4568, Austin Peay State University, Clarksville, TN 37044. Phone: 931-221-7451. Fax: 931-221-6127. E-mail: [email protected]. http://pacat.apsu.edu/pacat

Major Field Tests / ETS
Report information: Reports include individual scaled scores, departmental summary with department mean-scaled scores, and demographic information. Special score reports available for an additional fee.
National data available? Yes; for each test. Percentile tables for all seniors taking the current form of each test are published each year. Departments may obtain custom comparative data for an additional fee.
Local items and consortia options: Group scores are reported for up to 50 locally written questions.
Cost: $23.50 per test ($23 for 100 or more), plus shipping. Includes Test Administration Manual, standard processing, and national comparative data.
Contact: Dina Langrana at 609-683-2272. E-mail: [email protected]. www.ets.org/hea

ALUMNI

Comprehensive Alumni Assessment Survey (CAAS) / National Center for Higher Education Management Systems (NCHEMS)
Report information: Analysis includes one analytical report. Data file available for a fee.
National data available? No.
Local items and consortia options: Up to 20 local questions are available for a data entry fee of $1.25 per question.
Cost: $.85 per questionnaire plus shipping and handling. $200 for analysis (includes one analytical report).
Contact: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: [email protected]. www.nchems.org/surveys/caas.htm

College Results Survey (CRS) / Peterson's, a Thomson Learning Company
Report information: Institutions receive a data file of responses in spreadsheet format for analyses. Analytic tools for institution-based analyses and peer comparisons are being explored.
National data available? No. Analytic tools for peer comparisons have been developed and are available to participating institutions at a secure web site.
Local items and consortia options: Collaborative administration among institutions can be explored.
Cost: There is no respondent cost to complete the online CRS. Costs for institutional applications of the CRS are being explored as collaborative models are identified.
Contact: Rocco P. Russo, VP, Research, Peterson's, a Thomson Learning Company, Princeton Pike Corporate Center, 2000 Lenox Drive, P.O. Box 67005, Lawrenceville, NJ 08648. Phone: 609-896-1800 ext. 3250, toll-free: 800-338-3282 ext. 3250. Fax: 609-896-4535. E-mail: [email protected]. www.petersons.com/collegeresults

SERIES OF INSTRUMENTS

Student Outcomes Information Survey (SOIS) / NCHEMS
Report information: Analysis includes one analytical report. Data file available for a fee.
National data available? No.
Local items and consortia options: Up to 15 local questions are available for a data entry fee of $1.25 per question.
Cost: $.30 per questionnaire plus shipping and handling. $150 for analysis, which includes one analytical report.
Contact: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: [email protected]. www.nchems.org/sois.htm

Evaluation/Survey Services / ACT
Report information: Basic reporting package includes a summary report, graphics report, and normative report. Other reports and data file available for a fee.
National data available? Yes.
Local items and consortia options: Up to 30 local questions are available. Consortia reports are available for a fee.
Cost: $14.35 for 25 four-page surveys. $.84 per survey returned for processing. $168 for basic reporting package (summary report, graphics report, and normative report).
Contact: ACT, Postsecondary Services, Outcomes Assessment, P.O. Box 168, Iowa City, IA 52243-0168. Phone: 319-337-1053. Fax: 319-337-1790. E-mail: [email protected]. www.act.org/ess/index.html


TABLE 3 (continued). Reporting, Data Availability, Local Items, Costs, and Contact Information

FACULTY AND INSTITUTIONAL SURVEYS

Institutional Performance Survey (IPS) / National Center for Higher Education Management Systems (NCHEMS)
Report information: Report contains data for total campus, total faculty, and targeted populations.
National data available? No.
Local items and consortia options: Up to 20 local questions are available.
Cost: $1,600 for 100 questionnaires. Includes survey, pre-paid return postage, standard analyses, and report summary. After the first 100 questionnaires, $150 for each additional 50.
Contact: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: [email protected]. www.nchems.org/surveys/ips.html

Institutional Priorities Survey (IPS) / Noel-Levitz
Report information: The standard campus report includes the mean data for all respondents alongside national averages for like-type institutions. Optional reports (including IPS/SSI reports) and raw data are available for an additional fee.
National data available? Yes; three national comparison groups are standard, are available based on institution type, and are updated twice a year.
Local items and consortia options: Special comparison group reports are available for a fee.
Cost: $140 processing and setup fee plus $1.50 to $2.95 per survey, depending on the quantity ordered.
Contact: Julie Bryant, Program Consultant: [email protected], or Lisa Logan, Program Consultant: [email protected]. Phone: 800-876-1117. www.noellevitz.com

Program Self-Assessment Service (PSAS) and Graduate Program Self-Assessment Service (GPSAS) / ETS
Report information: Summary data report includes separate analyses for faculty, students, and alumni. Optional subgroup reports and data file available for a fee.
National data available? No.
Local items and consortia options: Local questions are available.
Cost: $37 for 25 questionnaires plus shipping and handling (minimum purchase of 75 questionnaires). $150 for summary data report plus $3.99 per booklet processed.
Contact: Karen Krueger at 609-683-2273. Fax: 609-683-2270. E-mail: [email protected]. www.ets.org/hea/heaweb.html#psas

Faculty Survey / HERI
Report information: Campus profile report includes faculty responses by gender. Separate profiles of teaching faculty and academic administrators also are provided. Normative profile includes national data by institutional type. Data file is also available.
National data available? Yes; in normative profile report.
Local items and consortia options: Local questions are available. Consortia analyses are available for a fee.
Cost: $325 plus $3.25 per returned survey.
Contact: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall, Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: [email protected]. www.gseis.ucla.edu/heri/cirp.htm
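
Because publishers price these instruments differently (flat participation or setup fees, per-survey charges, and per-completed-questionnaire scoring fees), comparing likely totals across instruments is easiest with a small calculation. The Python sketch below is illustrative only and is not part of any publisher's materials: the function name and the sample volumes are hypothetical, and the fee figures are the prices quoted in Table 3 at the time of publication, which institutions should confirm with each vendor before budgeting.

    # A minimal sketch (assumed, not vendor-supplied) for comparing total
    # survey costs built from the fee structures quoted in Table 3.
    def estimated_cost(base_fee, ordered=0, per_ordered=0.0,
                       returned=0, per_returned=0.0):
        """Base fee plus per-survey charges for ordered and returned forms."""
        return base_fee + ordered * per_ordered + returned * per_returned

    # Hypothetical volumes: 1,000 surveys ordered, 600 completed and returned.
    # CIRP Freshman Survey: $400 participation fee plus $1 per returned survey.
    cirp = estimated_cost(400, returned=600, per_returned=1.00)
    # CSEQ paper administration: $125 registration fee plus $.75 per survey
    # ordered and $1.50 scoring fee per completed questionnaire.
    cseq = estimated_cost(125, ordered=1000, per_ordered=0.75,
                          returned=600, per_returned=1.50)
    print(f"CIRP: ${cirp:,.2f}   CSEQ: ${cseq:,.2f}")

Under these assumed volumes the sketch yields $1,000.00 for the CIRP Freshman Survey and $1,775.00 for the paper CSEQ; the point is not the particular figures but that per-returned fees make expected response rates, not just order size, part of the budget estimate.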

