Actors, Agents, and Avatars: Visualizing Digital Humans in E-Commerce and Social Media
“Help! I Have a Problem” – Differences between a Humanlike and a Robot-like Chatbot Avatar in Complaint Management (2022-01-04)
To stand out from competitors, companies must offer not only good quality and prices but also excellent service. After-sales service in particular should ensure that customers who encounter problems are supported and satisfied; good complaint management is therefore essential. With the rise of economically viable chatbots, companies can now provide customers with 24/7 online service. To investigate which chatbot avatar, which compensation, and which reaction lead to higher behavioral intention, a 2x2x2 between-subjects experiment was conducted (N=389). The results show that the choice of avatar, the reaction, and the compensation all play a decisive role in influencing user behavior, increasing the probability that a customer who complains will nonetheless return and buy from the retailer again. Furthermore, behavioral intention can be explained by the mediating influences of anthropomorphism and the evaluation of redress.
Face It, Users Don’t Care: Affinity and Trustworthiness of Imperfect Digital Humans (2022-01-04)
Digital humans are growing in application and popularity, both as avatars for people and as standalone artificial-intelligence-controlled agents. While the technology for making a digital human look realistic is improving, we know little about how realistic they need to be. Humans are exceptionally good at identifying imperfect digital reproductions of human faces, so it has been reasoned that even slight imperfections in the visual design of digital humans may reduce their acceptance and effectiveness. The broadly held wisdom is that digital humans should be photorealistic and indistinguishable from real people. To examine this common belief, we collected data on individuals’ affinity for and trust in a photorealistic digital human engaged in a product-bidding situation, alongside a human presenter with varying degrees of video imperfections. The results reveal that participants noticed some of the video imperfections, but this did not adversely affect their willingness to pay, affinity, or trust. We found that once digital humans become close to realistic, users simply do not care about visual imperfections.