We present a new Bayesian econometric specification for a hypothetical Discrete Choice Experiment (DCE) that incorporates respondent rankings of attribute importance. Our results indicate that a DCE debriefing question asking respondents to rank the importance of attributes helps to explain the resulting choices. We also examine how the mode of survey delivery (online versus mail) affects model performance, finding that results are not substantively altered by delivery mode. We conclude that the ranking data are a complementary source of information about respondent utility functions within hypothetical DCEs.
- attribute importance rankings
- discrete choice experiment
- survey mode