Introduction to Web Surveys
Survey Research Laboratory
University of Illinois at Chicago
October 2010
Web Surveys
• First reported use in the early 1990s
• Dramatic increase in use over the past decade
• Numerous web survey software packages now available
Basic Advantages of Web Surveys
• Speed
• Cost
• Convenient (self-administered)
• Multi-media delivery (sound, video)
• Power of computer-assisted programming
• Unique, hi-tech
• Similar arguments were made regarding CATI (in the 1970s) and CAPI (in the 1980s) technologies
Typology of Web Surveys
With thanks to Mick Couper, University of Michigan Survey Research Center (see: Couper, "Web Surveys: A Review of Issues and Approaches," Public Opinion Quarterly 2000; 64: 464-494)
• Non-probability methods
• Probability methods
Non-probability Methods
1. Polls as entertainment
2. Unrestricted self-selected surveys
3. Volunteer opt-in panels
Probability Methods
4. Intercept surveys
5. List-based samples
6. Web option in mixed-mode surveys
7. Pre-recruited panels of internet users
8. Pre-recruited panels of full population
Type 1: Entertainment Polls
• "Question of the day"
• Intended primarily for entertainment
• No pretense at "science" or representativeness
• Basically harmless so long as the audience can distinguish them from real surveys
Type 2: Unrestricted self-selected surveys
• Open invitation on portals, frequently visited web sites, or dedicated "survey" sites
• No access restrictions
• Ballot-stuffing possible
• Equivalent of a 1-900 poll or magazine insert survey
• Main distinction from Type 1 is that claims to legitimacy are sometimes made here
Type 3: Volunteer panels of internet users
• Create a volunteer (opt-in) panel: using an open invitation, recruit a large number of people willing to do web surveys
• Use quota controls or random sampling to select persons from this group for a particular survey
• Control access through invitation & PIN
• Although the 2nd step (selection within the panel) is controlled, the 1st step is self-selected and uncontrolled
Type 3: Continued
• Claims to generalize to the total population
• Maybe the most common approach to web surveys now
• As with all panels, some of these pay respondents to participate
• Some claim that these panels are equal to or better than other forms of data collection based on probability methods
Probability-Based Methods
#4: Intercept Surveys
• Target is visitors to a web site
  • Customer satisfaction
  • Web site evaluation
• Systematic sample commonly used – every nth visitor
• No coverage problem because the population of interest is active web users
• Biggest problem is nonresponse
• Also a problem of timing – when to intercept?
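The "every nth visitor" systematic sample above can be sketched as follows. This is a minimal illustration only; the sampling interval of 10 and the random-start refinement are standard systematic-sampling choices, not details from the slides.

```python
import random

def build_intercept_sampler(n):
    """Return a function that flags every nth arriving visitor,
    starting from a random offset so the first selection is not
    always visitor #1 (a standard systematic-sampling refinement)."""
    start = random.randrange(n)  # random start in [0, n)
    def should_intercept(visitor_index):
        # visitor_index counts arriving visitors: 0, 1, 2, ...
        return visitor_index % n == start
    return should_intercept

# Intercept every 10th visitor to the site.
sampler = build_intercept_sampler(10)
selected = [i for i in range(100) if sampler(i)]
print(len(selected))  # 10 of the first 100 visitors are invited
```

Whatever the random start, exactly one visitor in every block of 10 is invited, which is what makes the sample systematic rather than a simple random draw.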
#5: List-Based Samples of High-Coverage Populations
• Recruit and invite those with web access to participate
• Restricted to current internet users
• Controlled access
• Nonresponse occurs at many stages in the process but can be measured (via analyses of baseline data)
• High-coverage population examples:
  • College students
  • Members of professional organizations
  • IT professionals
#6: Mixed-Mode Designs with Choice of Completion Method
• Web survey as part of a mixed-mode design
• Example: mail survey with an invitation to respond via the web
• Requires controlled access
• Concerns over equivalence of measurement ("mode effects")
• More expensive than a single-mode survey
• Offering respondents a choice helps confront nonresponse problems
#7: Pre-recruited Panels of Internet Users
• Random probability methods used to contact and invite persons with internet access to participate
• This approach is limited to active internet users
• More expensive than some other approaches
• May be difficult to assess nonresponse
#8: Probability Samples of Full Population
• Start with a probability sample of the target population
• Give everyone access to the internet in exchange for participation
• Only approach with the potential to be representative of the general population
• Biases and errors can be measured
• Nonresponse, panel effects, and costs are big concerns
• Example: Knowledge Networks
Designing Web Questionnaires
Basic Design Approaches
• Static web questionnaire
  • Survey in a single HTML document
  • Respondents can scroll through the document
  • Data sent to the server once, when the survey is completed
• Interactive web questionnaire
  • Questions are delivered one at a time or in modules
  • Data are sent to the server after each screen is completed
  • Conducive to use of skip patterns, consistency checks, range checks, etc.
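The skip patterns and range checks that interactive questionnaires support can be sketched as simple server-side logic. This is a hypothetical illustration; the question names ("employed", "occupation", "income") and the age range are invented for the example, not taken from the slides.

```python
def validate_age(answer):
    """Range check: reject implausible ages before the screen is stored.
    The 0-120 bound is an arbitrary example threshold."""
    try:
        age = int(answer)
    except ValueError:
        return None
    return age if 0 <= age <= 120 else None

def next_question(current, answers):
    """Skip pattern: route respondents past questions that do not apply."""
    if current == "employed":
        # Only respondents who answered "yes" see the occupation question.
        return "occupation" if answers.get("employed") == "yes" else "income"
    if current == "occupation":
        return "income"
    return None  # end of questionnaire

answers = {"employed": "no"}
print(next_question("employed", answers))  # skips straight to "income"
```

Because each screen is submitted separately, the server can run checks like these before deciding which screen to deliver next, which is exactly what a static single-page questionnaire cannot do.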
Static Web Questionnaires
• Very similar to mail and other self-administered questionnaires
• Can minimize download time
• Respondents can skip questions, but the process is not usually automated
  • Hypertext links can be used to facilitate skips
• All information is lost if the respondent quits before finishing
• More advantageous for short questionnaires
Interactive Web Questionnaires
• This approach permits the use of all computer-assisted programming devices
• May increase the length of the survey due to additional download time
• Partial data are captured for respondents who quit before finishing the questionnaire
• More advantageous for longer and more complex questionnaires
Progress Indicators
• The purpose is to motivate respondents to complete the questionnaire in the absence of an interviewer
  • Couper et al. (2001): 89.9% completed the survey with a progress indicator vs. 86.4% without one
• Very useful in interactive questionnaires, where the respondent does not know how long the questionnaire is
• Not necessary in static questionnaires, where respondents can determine the length by scrolling through it
• May add to survey length if it increases download time
  • There is some concern about increased break-offs
• Transition sentences are an alternative
  • Empirical evidence regarding their effectiveness is not clear
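In an interactive questionnaire, a progress indicator usually amounts to the current screen number over the total. A trivial sketch (the "Page X of Y" wording and text bar are invented formatting choices, not from the slides):

```python
def progress_text(current_screen, total_screens, bar_width=20):
    """Render 'Page X of Y' plus a simple text progress bar."""
    fraction = current_screen / total_screens
    filled = round(fraction * bar_width)
    bar = "#" * filled + "-" * (bar_width - filled)
    return f"Page {current_screen} of {total_screens} [{bar}] {fraction:.0%}"

print(progress_text(5, 20))  # Page 5 of 20 [#####---------------] 25%
```

Note that this requires knowing the total number of screens up front; with heavy skip logic the denominator is itself an estimate, one reason the empirical evidence on progress indicators is mixed.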
General Screen Design
• Do not use background colors or images
  • Background colors can create contrast & reading problems
  • "Visual noise"
General Screen Design #2
• Be aware that images may bias responses
  • Witte et al. (2004) – National Geographic Survey: images increased support for species protection
  • Couper et al. (2007) – healthy vs. sick person image: when exposed to the fit person, respondents consistently rated their own health as lower than when exposed to the sick person
• Use the upper right corner for contact information
  • Privacy/IRB information can be clickable from there
• If the top-of-screen format is consistent, respondents will tend to ignore that section across pages ("banner blindness")
General Screen Design #3
• Access to other relevant information can also be provided:
  • Answers to commonly asked questions about the survey
  • PDF versions of the full questionnaire
Effect of Color on Web Survey Completion
• Do not overuse color, but use it consistently
  • Use red only for emergency messages
• Red-green distinctions are a problem for persons who are color-blind
  • 10% of males are color-blind
  • 99% of color-blind persons cannot distinguish green & red
• White or off-white backgrounds seem to work best
  • Some evidence that Rs view black-on-white web pages as more 'professional' than white-on-black web pages
• Couper (2008) prefers light blue backgrounds
Color, continued
• For maximum readability, there should be high contrast between text color and background color
• Bright colors are easier to see than pastels
• Colored backgrounds are often used by spammers and may reduce response rates
Text
• Always avoid small font sizes (use 10-12 point)
• Appears to be some preference for Arial over Times Roman font
• Do not overuse bold, underline, italics, and other forms of emphasis
Question Presentation
• Avoid requiring R to scroll horizontally
  • Avoiding any scrolling may be best
• No agreement about inclusion of question numbers
  • Excluding them may avoid skip-logic confusion
• Likert questions (fully labeled) should be displayed vertically
Question Presentation #2
• Respondents are less likely to skip words when lines are kept short
• Provide computer-operating instructions at the precise point where a R may need to use that information
• When the # of responses cannot fit on a single screen:
  • Double- or triple-banking may be the best approach
  • Place a box around the categories in order to 'group' them as being relevant to the question
Question Presentation #3
• Visibility principle
  • Options that are visible are more likely to be selected than those that are not visible until the R takes some action to display them
• Response models
  • Serial processing model: search options for a pre-existing judgment
  • Deadline processing model: spend a certain amount of time and select the best answer found before the cognitive deadline (a form of satisficing)
Common Types of Response Options for Web Surveys
1. Radio buttons or boxes
2. Drop-down boxes
3. Check boxes
4. Slider bars
5. Text boxes
6. Open-ended questions
Radio Buttons
• Options are typically mutually exclusive
• Be careful not to use long grids that lose column headings
• Boxes instead of buttons
Drop-Down Boxes
• Useful only for closed lists of response options
• Can be designed to allow for single or multiple choices
• Options provided must be exhaustive
• Drop-down boxes are more difficult to use than radio buttons
Beware of Scroll Mice
• Healy (2007): drop-downs (compared to radio buttons) led to higher item nonresponse and longer response times
• Respondents using scroll mice to complete the survey were prone to accidentally changing an answer if presented with drop-down questions
Check Boxes
• Unlike radio buttons, multiple choices can be clicked via check boxes
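The radio-button vs. check-box distinction amounts to a single-choice vs. any-subset constraint when responses are validated. A minimal sketch (the option labels and function name are invented for illustration):

```python
def validate_selection(selected, allowed, multiple):
    """Radio-button semantics: exactly one choice from a closed list.
    Check-box semantics: any subset of the allowed options."""
    if not all(choice in allowed for choice in selected):
        return False  # an option outside the closed list was submitted
    if multiple:
        return True                # check boxes: 0..n selections
    return len(selected) == 1      # radio buttons: exactly 1

options = ["Yes", "No", "Don't know"]
print(validate_selection(["Yes"], options, multiple=False))        # True
print(validate_selection(["Yes", "No"], options, multiple=False))  # False
```

Validating on the server as well as in the browser matters for web surveys, since respondents' browsers cannot always be trusted to enforce the single-choice constraint.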
Radio Button/Check Box Hybrid
Slider Bars (a.k.a. visual analog scales, graphic rating scales, "sliders")
Slider Bars - Research
• Random experiment by Bayer & Thomas (2004) of Harris Interactive
  • Slider bars took about twice as long to complete as any other scale type (including semantic differentials, Likert, etc.)
  • Answering 2 slider-bar questions averaged 42.3 seconds, compared to 21.3 seconds for semantic differential questions
• Couper (2008) says results using slider bars are “quite similar” to what is obtained from a scale that uses radio buttons
Open-ended Questions
• Providing more space encourages respondents to provide longer answers
Present Single or Multiple Items per Screen?
• For interactive questionnaires, multiple items per screen:
  • Are completed more quickly by respondents
  • May provide more context
  • Intercorrelations among items are consistently higher when grouped together on one screen (Couper et al. 2001)
• Also, multiple-item screen versions:
  • Take less time to complete
  • Produce less missing data
Survey Navigation
• A consistent format should be followed
• Use action buttons that are different from any response input elements such as radio buttons
• "Next screen" or "next question" buttons should be on all pages
  • Crawford et al. (2005) recommend putting them in the lower left corner
• "Previous screen" or "previous question" buttons should be in the bottom right corner
Key Point
• Never force respondents to answer a question
• Adds to frustration
• IRB implications
• No other questionnaire formats 'force' answers
Key Questionnaire Design Principles: Summary
• Minimize respondent burden and frustration
• The fewer ‘clicks,’ the better
• The less scrolling, the better
• The fewer distractions, the better
• The fewer problems knowing how to navigate the questionnaire, the better
• The less download time required, the better
• Never force respondents to answer questions
Some other design recommendations to consider (from Couper 2008):
• Remove unneeded content and clutter
• Minimize the number of different colors and fonts being used
• Use consistent design formats throughout the entire instrument
• Avoid putting too much material on any page
Summary
• Web surveys vary greatly in their goals, design, execution, analysis, etc.
• Evaluation must be done in the context of the type of survey being conducted
• Cannot say that all web surveys are good or bad
• Methodological research is being done on a moving target
Thank You
timj@uic.edu