dc.contributor.author: Audrezet, Alice
dc.contributor.author: Parguel, Béatrice
dc.date.accessioned: 2018-06-22T17:41:18Z
dc.date.available: 2018-06-22T17:41:18Z
dc.date.issued: 2017
dc.identifier.issn: 1566-6379
dc.identifier.uri: https://basepub.dauphine.fr/handle/123456789/17874
dc.language.iso: en
dc.subject: attitude certainty
dc.subject: processing fluency
dc.subject: web evaluation
dc.subject: online data collection
dc.subject: tool design
dc.subject: instrument
dc.subject.ddc: 658.8
dc.subject.classificationjel: M.M3.M31
dc.title: Designing measurement tools to improve fluency and certainty: The case of online customer satisfaction evaluation
dc.type: Article accepted for publication or published
dc.description.abstracten: The development of online shopping has gone hand in hand with the need for self-administered online customer satisfaction evaluation. However, the specific context of online rating, without any face-to-face clarification, raises the question of the accuracy and appropriateness of the chosen tool for respondents. To address this issue, this research proposes the new concept of “response fluency” to describe the ease with which a question is processed. Applied to the Evaluative Space Grid, a grid recently proposed in psychology to measure overall evaluation, this research shows how response fluency mediates the influence of measurement tool design on response certainty. More specifically, it tests the effects of two alternative tool design formats (i.e., a reduction in the number of the grid’s response cells and the display of labels in the response cells) on response fluency and certainty. Using a between-subjects experiment, we show that displaying labels in the cells increases response fluency and, in turn, response certainty. By contrast, reducing the number of response cells produces no effect. We contend that well-designed measurement tools can make the process of responding more fluent and increase respondents’ subjective confidence in their ability to convey their true evaluations. Finally, this work calls for new research into designing measurement tools that engage respondents when answering surveys and reduce dropout rates, which is especially challenging in self-administered electronic settings.
dc.relation.isversionofjnlname: Electronic Journal of Information Systems Evaluation
dc.relation.isversionofjnlvol: 20
dc.relation.isversionofjnlissue: 2
dc.relation.isversionofjnldate: 2017
dc.relation.isversionofjnlpages: 116-127
dc.relation.isversionofjnlpublisher: Academic Conferences and Publishing International Limited
dc.subject.ddclabel: Marketing
dc.relation.forthcoming: no
dc.relation.forthcomingprint: no
dc.description.ssrncandidate: no
dc.description.halcandidate: yes
dc.description.readership: research
dc.description.audience: International
dc.relation.Isversionofjnlpeerreviewed: no
dc.date.updated: 2018-06-21T13:05:44Z
hal.person.labIds: 30584
hal.person.labIds: 1032
hal.identifier: hal-01821838*
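
The record above follows DSpace's qualified Dublin Core scheme, in which several fields (e.g., dc.contributor.author, dc.subject, hal.person.labIds) are repeatable. As a minimal sketch of how such a record might be held in memory for downstream processing, the following Python snippet maps each field name to a list of values; the dictionary layout and the subset of fields shown are illustrative assumptions, not part of the record itself.

```python
from collections import defaultdict

# Minimal in-memory representation of a qualified Dublin Core record:
# each field name maps to a list of values, because fields such as
# dc.contributor.author and dc.subject may occur more than once.
record: dict[str, list[str]] = defaultdict(list)

for field, value in [
    ("dc.contributor.author", "Audrezet, Alice"),
    ("dc.contributor.author", "Parguel, Béatrice"),
    ("dc.date.issued", "2017"),
    ("dc.identifier.issn", "1566-6379"),
    ("dc.subject", "attitude certainty"),
    ("dc.subject", "processing fluency"),
]:
    record[field].append(value)

# Repeatable fields come back as lists; single-valued fields as
# one-element lists.
print(record["dc.contributor.author"])  # ['Audrezet, Alice', 'Parguel, Béatrice']
print(record["dc.date.issued"][0])      # '2017'
```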

