Dynamic Voice Clones Elicit Consumer Trust

Date
2022-01-04
Authors
Schanke, Scott
Burtch, Gordon
Ray, Gautam
Abstract
Platforms today are experimenting with many novel personalization technologies. We explore one such technology here, voice-based conversational agents, with a focus on consumer trust. We consider the joint role of two key design/implementation choices, namely i) disclosing an agent's autonomous nature to the user, and ii) aesthetic personalization, in the form of user voice cloning. We report on a set of controlled experiments based on the investment game, evaluating how these design choices affect subjects' willingness to participate in the game against an autonomous, AI-enabled partner. We find no evidence that disclosure affects trust. However, we find that the greatest level of trust is elicited when a voice-based agent employs a clone of the subject's voice. Mechanism explorations based on post-experiment survey responses indicate that voice cloning induces trust by eliciting a perception of homophily; the voice clone induces subjects to personify the agent and picture it as demographically similar.
Keywords
Crowd-based Platforms, deep fake, deep learning, personalization, voice clone
Extent
10 pages
Related To
Proceedings of the 55th Hawaii International Conference on System Sciences
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International