Towards a common standard – A reporting checklist for web-based stated preference valuation surveys

Angeliki N. Menegaki¹, Søren Bøye Olsen², and Konstantinos P. Tsagarakis³

1 a) Organismos Georgikon Asfaliseon, Regional Branch of Eastern Macedonia & Thrace, 69100 Komotini, Greece
  b) Hellenic Open University, Parodos Aristotelous 18, 26335 Patras, Greece
2 Department of Food and Resource Economics, Faculty of Science, University of Copenhagen, Rolighedsvej 25, 1958 Frederiksberg C, Denmark
3 Democritus University of Thrace, Department of Environmental Engineering, Business Economics and Environmental Technology Lab, www.beteco.org, [email protected]

Page 2

• Why do we need valuation?

Page 3

Page 4

Page 5

Approaches and valuation methods when markets do not exist (non-market valuation methods)

Revealed preference techniques (surrogate markets):

• Averting expenditures

• Hedonic pricing

• Travel cost method

Stated preference techniques (hypothetical markets):

• Contingent valuation

• Choice experiments

• Benefit transfer

Page 6

We considered 182 web-based valuation surveys, an update of the …

Page 7

Figure 1. Number of collected surveys (2001-2015).

Page 8

Figure 2. Valuation methods used in web surveys (2001-2015).

Page 9

Figure 3. Allocation of incentives used in web surveys (2001-2015).

Page 10

Figure 5. Topics of the collected web surveys (2001-2015).

Page 11

Important Findings

• Finding 1: The more recent the study's year of publication, the higher the response rate.

• Finding 2: Response rates differ among continents.

• Finding 3: Response rates differ among the research companies that undertake sampling and survey administration.

• Finding 4: There is no difference in response rates between surveys using a cash incentive and those using a prize incentive.

Page 12

Important Findings

• Finding 5: Response rates differ between studies providing cash or prize incentives and studies providing no incentives.

• Finding 6: There is no difference in response rates between the CV and CE methods.

• Finding 7: There is no difference in response rates between single-mode and mixed-mode studies.

• Finding 8: The response rate is independent of the survey topic.
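Findings 4-8 are statements about differences (or their absence) in mean response rates between groups of surveys. As a sketch of one such comparison, here is Welch's t-statistic computed on hypothetical cash- vs. prize-incentive response rates; the study's actual data and statistical tests differ:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples with unequal variances."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / (variance(a) / na + variance(b) / nb) ** 0.5

# Hypothetical response rates (%) for cash- vs. prize-incentive surveys.
cash = [32.0, 28.5, 35.1, 30.2, 27.8]
prize = [31.5, 29.0, 33.8, 28.9, 30.0]

# A small |t| is what "no difference" (Finding 4) looks like.
t = welch_t(cash, prize)
print(round(t, 3))
```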

Page 13

Questionnaire design & web administration

• How does the information menu appear?
• Has respondent effort been measured? (How many respondents chose to access supplementary material? How long did they spend reading it?)
• Was there a scrolling design (one page) or a screen-by-screen design (more pages)?
• Were there text entry boxes or multiple-choice questions? Were there long entry boxes or short entry boxes?
• Were there drop-down menus?
• Were respondents allowed to drop the survey and resume later?
• Was there a real-time presentation of the survey results, which allowed respondents to adapt their answers in order to reach the required target?
• How is the innovativeness of the study demonstrated?
• How are ethics in the study guaranteed (e.g. approval by an ethics committee)?
• How was confidentiality guaranteed?
• How was variance of scope sensitivity examined?
• Has the survey taken Dillman's principles into consideration?
• How were respondents prevented from answering multiple times?
• In the case of a mixed-mode survey, did respondents have the option of choosing between the two modes?
• Has the survey explicitly explained its usefulness to respondents?
• Did the two-mode surveys take place concurrently or sequentially?
• Were online focus groups and chat rooms available?
• Did the survey allow respondents the opportunity to write their opinion about it (e.g. did they find the elicitation format easy to understand)?
• Was mixed mode simultaneous or consecutive?
• What type of paradata (if any) were observed?

Sampling frame & self-selection

• Description of the sampling method
• Details of panelists' recruitment by the research company
• How were computer non-owners or computer-illiterate people handled? (e.g. was a hotline available? Did the survey provide its own PCs?)
• Did the survey collect information about respondents who decided not to participate in the survey?
• If there was a convenience sample: elaborate on its representativeness
• If the sample was random: elaborate on its randomness
• Describe the piloting process

Invitation

• How many times was a respondent invited to participate?
• Questionnaire positioning: Was it sent by email or communicated by phone? Was it embedded in a web page? Was it communicated with a link? Was an individual password sent?
• Was a screening phone call used?
• Were key officials contacted to send emails to their employees?
• For how long was the invitation kept alive?
• After how much time were initial invitations followed by email reminders? Were there automatic reminders?
• Was the invitation reminder sent by email or another form of communication?

Stimuli & remuneration

• Did colored photographs have an influence on respondents?
• What animations, graphics, sounds, and fonts were used?
• Was any introductory video played?
• What incentive was given to participants (gift, cash, prize, etc.)?
• If no gift was available, is that clearly stated?
• Was completion time measured?
• What was the speed of return for a completed questionnaire?
• Was a minimum completion time imposed?

Figure 8. Checklist for web surveys
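One way to operationalize such a checklist is as structured data, so that the completeness of a survey report can be scored automatically. A minimal sketch follows; the section names are taken from the figure, but the abbreviated item names and the scoring scheme are our illustration, not part of the checklist itself:

```python
# Hypothetical encoding of the Figure 8 checklist: each item is marked
# True if the survey report addresses it. Item names are illustrative.
checklist = {
    "Questionnaire design & web administration": {
        "scrolling vs screen-by-screen design described": True,
        "prevention of multiple responses described": False,
        "paradata types reported": True,
    },
    "Sampling frame & self-selection": {
        "sampling method described": True,
        "panel recruitment detailed": False,
    },
    "Invitation": {
        "number of invitations reported": True,
        "reminder schedule reported": True,
    },
    "Stimuli & remuneration": {
        "incentive type stated": True,
        "completion time measured": False,
    },
}

def completeness(cl):
    """Fraction of checklist items addressed by the report."""
    items = [v for section in cl.values() for v in section.values()]
    return sum(items) / len(items)

print(f"{completeness(checklist):.0%}")
```

Scoring a corpus of published web surveys this way would make reporting gaps directly comparable across studies.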

Page 14

Page 15

Page 16

Page 17

Page 18

Figure 8. Web add-ins in the process of a typical stated preference survey

Debrief

• A non-response sample can be better analyzed and correlated to its demographics.
• Trace contradictions between answers to debrief questions and answers to main questions (also with cross-tabulations).
• Collection of paradata is possible (e.g. completion time, number of clicks, number of changes made, use of help functions, etc.).
• Text areas can be added for those who wish to fill them in.
• Web panel data allow for comparisons of results over time, because the individual information is continually updated (e.g. Knowledge Networks updates profiling data every 2-4 months).

Main (valuation) questions

• Experiment between scrollable and screen-by-screen designs.
• Upon completion of a WTP question, the web design can immediately calculate how much this amount is in the personal budget and remind respondents of their budget constraint, in order to reduce overstatement and make the exercise more realistic.
• Offer clever filters of "don't know" responses.
• Offers a discreet environment.
• Can offer a hotline for guidance and technical support.
• Experiment on issues such as questionnaire length, framing, order, scope, open vs. closed questions, etc.
• Can estimate respondent fatigue and adapt the questionnaire to prevent satisficing.
• Avoid multiple answers when only one is required.
• Reduces information and questionnaire space with drop-down menus.

Piloting

• Online focus groups and chat rooms are possible.
• Pre-test surveys with multiple browsers and screen settings.

Warm-up questions

• Can screen the respondent panel further with questions that aim to find out whether one is in the right frame of mind to answer, or record their sentiments.
• Keep respondents motivated with progress-indicator features on the screen.

Information session

• Better visualization (photos, videos), colours, sounds, fonts.
• The respondent can click on the information and have it read aloud and clearly.
• The information can expand with various pop-up tools (depending on how much information the person needs, based on the level of information already at hand).
• The information can be heard from a voice instead of being given in a separate instruction manual. The instructions can also be given step by step when needed.

Sampling frame

• It is performed with web panels (mainly recruited by research companies, etc.).
• Personalized invitations and reminders.
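The debrief stage above mentions paradata such as completion time, click counts, and answer changes. A minimal sketch of how a web survey back end might record them per respondent follows; the class and field names are hypothetical, not from any particular survey platform:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Paradata:
    """Hypothetical per-respondent paradata record (names are illustrative)."""
    started_at: float = field(default_factory=time.monotonic)
    clicks: int = 0
    answer_changes: int = 0
    help_opened: int = 0

    def completion_seconds(self) -> float:
        """Elapsed time since the respondent started the questionnaire."""
        return time.monotonic() - self.started_at

# Simulate a few interaction events for one respondent.
p = Paradata()
p.clicks += 3
p.answer_changes += 1
print(p.clicks, p.answer_changes, p.completion_seconds() >= 0)
```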

Page 19

Page 20

Page 21

Table 7. Comparison of web surveys to other survey modes (face-to-face, telephone, mail, CAPI, web) with respect to eight characteristics: cost, response rate, design richness, representativeness, suitability for sensitive topics, proneness to the warm-glow effect, self-selection, and data-handling possibilities (e.g. input, transformation).

Page 22

Figure 9. Survey modes box: survey modes (face-to-face interview, telephone interview, CATI, CAPI, web, mail survey) positioned along two axes, ranging from zero to full technology and from no to full human interaction.

Page 23

Acknowledgement

The authors acknowledge the insights gained from participation in the WEBDATANET (COST Action IS1004) network (http://webdatanet.cbs.dk/).

