Dynamic Voice Clones Elicit Consumer Trust

dc.contributor.author: Schanke, Scott
dc.contributor.author: Burtch, Gordon
dc.contributor.author: Ray, Gautam
dc.date.accessioned: 2021-12-24T17:59:00Z
dc.date.available: 2021-12-24T17:59:00Z
dc.date.issued: 2022-01-04
dc.description.abstract: Platforms today are experimenting with many novel personalization technologies. We explore one such technology here, voice-based conversational agents, with a focus on consumer trust. We consider the joint role of two key design/implementation choices, namely i) disclosing an agent's autonomous nature to the user, and ii) aesthetic personalization, in the form of user voice cloning. We report on a set of controlled experiments based on the investment game, evaluating how these design choices affect subjects' willingness to participate in the game against an autonomous, AI-enabled partner. We find no evidence that disclosure affects trust. However, we find that the greatest level of trust is elicited when a voice-based agent employs a clone of the subject's voice. Mechanism explorations based on post-experiment survey responses indicate that voice cloning induces trust by eliciting a perception of homophily; the voice clone induces subjects to personify the agent and picture it as demographically similar.
dc.format.extent: 10 pages
dc.identifier.doi: 10.24251/HICSS.2022.538
dc.identifier.isbn: 978-0-9981331-5-7
dc.identifier.uri: http://hdl.handle.net/10125/79875
dc.language.iso: eng
dc.relation.ispartof: Proceedings of the 55th Hawaii International Conference on System Sciences
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Crowd-based Platforms
dc.subject: deep fake
dc.subject: deep learning
dc.subject: personalization
dc.subject: voice clone
dc.title: Dynamic Voice Clones Elicit Consumer Trust
dc.type.dcmi: text

Files

Name: 0434.pdf
Size: 357.78 KB
Format: Adobe Portable Document Format