Transcript
Page 1

Lesli Scott
Ashley Bowers
Sue Ellen Hansen
Robin Tepper Jacob

Survey Research Center, University of Michigan

Third International Conference on Establishment Surveys
Montreal, Quebec, Canada
June 18-21, 2007

Evaluation and Implementation of EDR in School-Based Research

Page 2

Introduction

The Survey Research Center (SRC) conducts national school-based studies of elementary/secondary education and of children and adolescents.

Typically, SRC school-based studies include a data collection activity to obtain information about each school from the principal.

The focus of this presentation is on assessing the potential effectiveness of using Web surveys to obtain school information from principals in the U.S.

Page 3

Introduction

Currently, most researchers choose the traditional mail mode:
• Use multiple contacts following a tailored design approach (Dillman), with scannable instruments and an incentive.
• Attain 60-85+% response rates.

One SRC research effort over the past couple of years used the Web mode:
• Principals in a large district completed daily activity logs during three one-week periods in each of three years (70-90% response).
• The same principals also completed annual Web surveys for three years (70-90% response).
• These studies provided some evidence that Web surveys can attain acceptable response rates in regional school studies.

Page 4

Key Research Questions

This presentation reports on two recent studies that help evaluate the feasibility of transitioning some national school studies to the Web mode.

Key research questions considered include:
• What is the impact of Web mode on coverage error?
• What is the impact of Web mode on sampling error?
• What is the impact of Web mode on nonresponse error?
• What future research is needed?

Page 5

Two Recent Studies (1)

Study 1: Principals' Use of Internet Study (WEB)
• In Spring 2007, we administered a brief Web survey.
• The main purpose was to assess properties of the sample through the phases of the study.
• We developed a national sample of 500 schools randomly drawn from the U.S. Department of Education database.
• We carried out Internet searches and school calls to obtain principal names and email addresses.
• We sent principals a UPS-delivered pre-notification letter with a $5 incentive, then followed up with up to three email contacts.

Page 6

Two Recent Studies (2)

Study 2: Principals' Mode Preference Study
• Our second study analyzed similar questions asked in the WEB study and in a School Health Policy (MAIL) survey.
• The main purpose was to learn about principals' access to the Internet and their preference for the mail or Web mode.
• The surveys used nationally representative samples of 500 schools covering grades K-8 (WEB) and 600 schools covering grades 8-12 (MAIL).

Page 7

Web Mode & Coverage Error

Coverage error considers whether the list used to create the study sample includes all members of the population and whether each member's probability of selection is known.

Hypothesis: Web mode provides nearly universal access to principals, and the principals we can and cannot access do not differ on key characteristics.
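
For context, a standard textbook expression (not from the original slides) writes the coverage bias of a covered-sample mean as the noncovered share of the population times the covered/noncovered difference; this is why comparing school characteristics across the two groups on the later slides speaks directly to coverage bias.

```latex
% Standard expression for coverage bias (reference formula, not from the slides).
% N_{NC}/N is the noncovered share of the population; \bar{Y}_{C} and \bar{Y}_{NC}
% are the population means among covered and noncovered schools.
\mathrm{Bias}(\bar{y}_{C}) = \frac{N_{NC}}{N}\left(\bar{Y}_{C} - \bar{Y}_{NC}\right)
```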

Page 8

Coverage – All Principals Listed?

The U.S. Department of Education provides a census of schools.

Recent reports indicate that virtually all schools have high-speed Internet access.

What about principals? Our studies helped us answer:
• Do principals use high-speed Internet?
• Can we obtain email addresses for principals?
• How similar are schools where we can deliver email to principals and schools where we cannot?

Page 9

Coverage – Internet Access

National Center for Education Statistics report (2006):
– 100% of schools have Internet access.
– 97% of schools have high-speed connections.

Answers to questions on our Web and mail surveys showed that principals use high-speed connections.

Hi-Speed Internet Access at School
WEB Study: 97%
MAIL Study: 98%

Preliminary Results

Page 10

Coverage – Email Available?

U.S. Department of Education databases do not include principal names or email addresses.

Our Web study showed that we could obtain names and email addresses using Internet searches followed by calls to schools.

Obtaining Email Addresses
Obtained During 1st Internet Search: 55%
Obtained by Adding School Calls: 90%
Successful Email Delivery to Principal: 84%

Preliminary Results

Page 11

Coverage – Similar to Sample?

Our study showed no difference in key school characteristics for principals with and without deliverable emails.

School Characteristic Differences - Deliverable Emails vs. Sample

School Characteristic                    Test Statistic    p     Significantly Different?
Locale                                   chi-square=1.49   0.98  No
Region                                   chi-square=0.69   0.87  No
Number of Students                       t=0.76            0.45  No
Number of Non-White Students             t=0.01            0.99  No
Number of Free/Reduced Lunch Students    t=-0.04           0.97  No

Preliminary Results
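
The table reports chi-square tests for categorical characteristics and t-tests for continuous ones. The following Python sketch shows how such comparisons could be computed; it is illustrative only (not the authors' code), and the data frame and column names (email_delivered, locale, n_students, and so on) are assumptions. The same approach applies to the respondent-versus-sample comparison shown later for the WEB Study.

```python
# Illustrative sketch of the "deliverable email vs. rest of sample" comparisons above.
# The data frame and column names are hypothetical.
import pandas as pd
from scipy import stats

def compare_groups(schools: pd.DataFrame) -> None:
    """Compare schools with and without a deliverable principal email address."""
    covered = schools[schools["email_delivered"]]
    uncovered = schools[~schools["email_delivered"]]

    # Categorical characteristics (e.g., locale, region): chi-square test of independence.
    for col in ["locale", "region"]:
        table = pd.crosstab(schools["email_delivered"], schools[col])
        chi2, p, _, _ = stats.chi2_contingency(table)
        print(f"{col}: chi-square={chi2:.2f}, p={p:.2f}")

    # Continuous characteristics (e.g., enrollment counts): two-sample t-test.
    for col in ["n_students", "n_nonwhite_students", "n_free_reduced_lunch"]:
        t, p = stats.ttest_ind(covered[col], uncovered[col], equal_var=False)
        print(f"{col}: t={t:.2f}, p={p:.2f}")
```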

Page 12

Coverage Error - Conclusions

• Nearly all principals have access to high-speed Internet.
• It is possible to obtain email addresses for most principals using a procedure that involves Internet searches and telephone calls to schools.
• There are no statistically significant differences on key school measures between schools where we can deliver email to principals and those where we cannot; that is, there is no evidence of coverage bias.

Page 13

Web Mode & Sampling Error

Sampling error considers the precision of estimates, which depends in part on the number of units included in the random sample.

Hypothesis: The Web mode gains more precision for the same cost as the mail mode.

Our study helped us identify the cost drivers for the two modes, and we estimated the magnitude of the difference between them.
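
For reference (a standard result, not from the slides), under simple random sampling the standard error of an estimated proportion shrinks with the square root of the sample size, which is what links lower per-case cost to lower sampling error.

```latex
% Standard error of a proportion under simple random sampling (ignoring the finite
% population correction); doubling n cuts the standard error by a factor of 1/\sqrt{2}.
\mathrm{SE}(\hat{p}) = \sqrt{\frac{\hat{p}(1-\hat{p})}{n}},
\qquad
\frac{\mathrm{SE}_{2n}}{\mathrm{SE}_{n}} = \frac{1}{\sqrt{2}} \approx 0.71
```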

Page 14

Sampling – Main Cost Factors

Web and mail survey costs include:
• Sample Development
• Pre-Notification with Incentive
• Questionnaire Development
• Application Programming
• Survey Administration
• Post-Collection Processing

Page 15

Sampling – Costs (1) (Hypothetical 2-Page Questionnaire)

Sample Development
  Web Mode (n=500): NCES Data, Internet Search, School Calls, Preload (~$2,700)
  Mail Mode (n=500): NCES Data, Purchase Names, School Calls, Mail Merge (~$1,700)

Pre-Notification with Incentive
  Web Mode (n=500): Materials, Postage, Incentive, Labor (~$5,900)
  Mail Mode (n=500): Materials, Postage, Incentive, Labor (~$5,900)

Paper Questionnaire and Materials Development
  Web Mode (n=500): not applicable
  Mail Mode (n=500): Questions Provided, Format Paper Questionnaires, Print Questionnaires, Print Letters, Postage (~$9,500)

Page 16

Sampling – Costs (2) (Hypothetical 2-Page Questionnaire)

Programming
  Web Mode (n=500): Web (~$500)
  Mail Mode (n=500): DDE (~$500)

Survey Administration
  Web Mode (n=500): 1st Email, 2nd Email, 3rd Email (~$550)
  Mail Mode (n=500): 1st Mail Logging, 2nd Mail Logging, 3rd Mail Logging (~$1,800)

Data Processing
  Web Mode (n=500): Data Handling (~$350)
  Mail Mode (n=500): DDE, Data Handling (~$600)

TOTAL
  Web Mode: ~$10,000
  Mail Mode: ~$20,000

Page 17

Sampling Error - Conclusions

• The cost to conduct a Web survey of principals should almost always be less than the cost to conduct the survey using the traditional mail mode.
• Cost savings can be used to increase the sample size, thus reducing sampling error (see the sketch below).
• In addition, the marginal cost of adding a case in a Web survey is generally smaller than the cost of adding a case in a mail survey.
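
As a minimal sketch of the sample-size argument (my assumptions, not the authors' analysis): treating the approximate totals above (~$10,000 for Web and ~$20,000 for mail at n=500) as average costs per case, a fixed budget buys roughly twice the Web sample and cuts the standard error of a proportion by about 1/sqrt(2). The budget figure, the use of average rather than marginal cost, and simple random sampling are all simplifying assumptions.

```python
# Illustrative sketch: what the cost totals above imply for sample size and precision
# under a fixed budget. Uses average cost per case (total / 500) and ignores the
# fixed-vs-marginal cost distinction noted in the conclusions, so it is a rough guide only.
import math

def se_proportion(p: float, n: int) -> float:
    """Standard error of a proportion under simple random sampling (no fpc)."""
    return math.sqrt(p * (1 - p) / n)

budget = 20_000.0                                                      # assumed total budget
avg_cost_per_case = {"Web": 10_000.0 / 500, "Mail": 20_000.0 / 500}    # ~$20 vs. ~$40, from the totals above

for mode, cost in avg_cost_per_case.items():
    n = int(budget // cost)
    print(f"{mode}: affordable n ~ {n}, SE of a 50% estimate ~ {se_proportion(0.5, n):.3f}")
```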

Page 18

Web Mode & Nonresponse Error

Nonresponse error considers whether sample members complete the survey and whether responders have different characteristics than nonresponders.

Hypothesis: Participation in Web surveys is at least as good as in mail surveys, and principals who participate and those who do not look similar on key school characteristics.
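
For context, a standard expression (not from the original slides) writes nonresponse bias in the respondent mean as the nonresponse rate times the respondent/nonrespondent difference; this is why both the response rate and the respondent-versus-sample comparison on the following slides matter.

```latex
% Deterministic expression for nonresponse bias in the respondent mean (reference formula).
% m/n is the nonresponse rate; \bar{y}_{r} and \bar{y}_{m} are the respondent and
% nonrespondent means.
\mathrm{Bias}(\bar{y}_{r}) = \frac{m}{n}\left(\bar{y}_{r} - \bar{y}_{m}\right)
```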

Page 19

Nonresponse – How Willing?

Our studies helped us answer the following:

• Are principals receiving requests and choosing to respond to Web surveys?
• Do principals have a preference for Web or mail surveys?
• Can we obtain response rates for a Web survey of principals that are comparable to mail?
• Do those principals who participate in a Web survey look similar to those who do not on key school characteristics?

Page 20

Nonresponse – Web Survey Experience

• On average, MAIL Study principals received more than one Web survey request during the past year.
  – Most completed at least one.
• WEB Study principals have participated in Web surveys at even higher rates.

Web Survey Requests Received During Past Year
                 WEB Study Principals (%)    MAIL Study Principals (%)
0 requests       21                          21
1-3 requests     54                          29
4-6 requests     17                          27
7+ requests      8                           23

Preliminary Results

Page 21

Nonresponse – Preference Web vs. Mail

• Most MAIL study and WEB study principals indicated that they prefer Web surveys over mail surveys.

• Few differences in preference by school characteristics or available principal demographics (e.g., gender).

Page 22

Nonresponse – WEB Response Rates

• The response rate achieved in the WEB study was comparable to typical mail survey response rates.

• The length of time needed to conduct a survey is greatly reduced using the Web compared to traditional mail data collection.

Page 23

Nonresponse – Similar to Sample?

Our study showed no difference in key school characteristics for principals who participated in the WEB Study and those who did not.

School Characteristic Differences - Web Respondents vs. Sample

School Characteristic                    Test Statistic    p     Significantly Different?
Locale                                   chi-square=6.16   0.52  No
Region                                   chi-square=4.01   0.26  No
Number of Students                       t=0.17            0.86  No
Number of Non-White Students             t=-0.92           0.36  No
Number of Free/Reduced Lunch Students    t=-0.52           0.61  No

Preliminary Results

Page 24

Nonresponse Error - Conclusions

• Principals are receiving Web survey requests and responding to them.

• Many principals express a preference for Web surveys over mail surveys.

• Our experience conducting a Web survey of principals in a random sample of U.S. public schools provides evidence that Web surveys can achieve response rates comparable to mail surveys.

• Principals who participated in the Web study generally looked similar to those who did not on key school characteristics.

Page 25

Future Research (1)

Measurement Error Research
• Do Web surveys measure differently than other modes?
• What aspects of Web design influence the quality of response?

Page 26

Future Research (2)

Complexity Found in Establishment Studies
• What are the impacts of introducing long and complex instruments?
• Will principals input information collected from school files into a Web instrument?
• Will principals forward portions of Web instruments to other school staff?

Page 27

Future Research (3)

Some Coverage Error Research Questions:
• What might we learn if we make contact with principals in schools where we could not obtain deliverable email addresses?
• Are there specific groups of principals who do not want to be contacted and, if so, what can we learn about them?
• Are email services in schools provided by the school, district, state, or other entities?
• How much do principals use home computers for work-related activities? How often would they choose to complete work-related surveys at home?
• In addition to principals, can we obtain email addresses for multiple informants at each school who might assist with school surveys?

Page 28

Future Research (4)

Experimental Design

• Ideally, our next step will be to set up research using random assignment to Web and mail modes (see the sketch below).
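
As a minimal sketch of what such random assignment could look like (hypothetical code, not the authors' design; the school IDs, the 50/50 split, and the absence of stratification are assumptions):

```python
# Illustrative sketch: randomly assign sampled schools to the Web or mail mode.
# School IDs and the 50/50 split are hypothetical; in practice the assignment
# would likely be stratified by key school characteristics.
import random

def assign_modes(school_ids: list[str], seed: int = 2007) -> dict[str, str]:
    """Randomly assign half of the schools to the Web mode and half to the mail mode."""
    rng = random.Random(seed)          # fixed seed so the assignment is reproducible
    shuffled = school_ids[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {sid: ("Web" if i < half else "Mail") for i, sid in enumerate(shuffled)}

# Example with hypothetical school IDs:
assignment = assign_modes([f"school_{i:03d}" for i in range(1, 11)])
print(assignment)
```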

Page 29

Final Conclusions

Special Thanks to:

Andrew Hupp

Timothy Wright

Jackie McBride

