Behavioral Economics in the Digital Economy: Digital Nudging and Interface Design

Recent Submissions

  • Item
    How Digital Nudges Affect Consideration Set Size and Perceived Cognitive Effort in Idea Convergence of Open Innovation Contests
    (2019-01-08) Boskovic-Pavkovic, Ivan; Seeber, Isabella; Maier, Ronald
    Open innovation initiatives are useful for acquiring many ideas, but they often face problems when it comes to selecting the best ones. Idea convergence has been suggested as a first step in idea selection to filter out those ideas that are worthy of further consideration. Digital nudges – digital interventions that aim to alter human behavior in a predictable way – could support convergence. However, their effects are largely unknown. This study explores how two digital nudges, selection strategy (inclusion/exclusion) and idea subset similarity (similar/random), affect two convergence outcomes: consideration set size and perceived cognitive effort. We conducted a laboratory experiment with 88 students and found that guiding individuals towards an inclusion strategy results in smaller consideration sets and higher perceived cognitive effort. Moreover, presenting individuals with subsets of similar ideas resulted in smaller consideration sets. These insights are relevant for the design and use of digital nudges for convergence in open innovation environments.
  • Item
    Nudging the Classroom: Designing a Socio-Technical Artifact to Reduce Academic Procrastination
    (2019-01-08) Rodriguez, Joaquin; Piccoli, Gabriele; Bartosiak, Marcin
    Procrastination is a widespread and detrimental human behavior. Virtually everyone delays the initiation or completion of important tasks at times, and some people procrastinate to the point that they become overwhelmed by their inaction. Academic procrastination in particular is estimated to afflict 70 to 90% of undergraduate college students. We adopt the design science problem-solving paradigm to pilot a socio-technical artifact that reduces academic procrastination in large college classrooms. Drawing on the principles of nudging, we propose three meta-requirements and nine design principles underlying the design of a chatbot that nudges students toward positive, self-reinforcing behaviors that counter procrastination tendencies. We use a formative natural evaluation event to provide preliminary validation for the design. The pilot yields encouraging results, both in the intended audience's use of the artifact and in performance improvement, and can therefore inform future design iterations.
  • Item
    Personalised Nudging for more Data Disclosure? On the Adaption of Data Usage Policies Format to Cognitive Styles
    (2019-01-08) Schöning, Charlotte; Matt, Christian; Hess, Thomas
    While highly sensitive data such as personal health information (PHI) is valuable to digital health service providers, users often remain reluctant to disclose such personal data. Research has shown that personalised nudging, i.e. nudging that adapts content to user characteristics in order to nudge specific actions, can successfully increase purchase intention. However, its effect on consumers’ handling of sensitive data is unclear. We apply personalised nudging to the personalisation of data usage policies and investigate whether personalised nudges that match users’ cognitive styles (i.e. the way users process information) affect individuals’ levels of trust, privacy concerns, perceived risk, and PHI disclosure. Using an online experiment in the context of mobile apps for health bonus programmes, we find that when the presentation format matches a user’s cognitive style, individuals’ PHI disclosure and trust are not affected, but their privacy concerns and risk perceptions are significantly lower.
  • Item
    Can Experience be Trusted? Investigating the Effect of Experience on Decision Biases in Crowdworking Platforms
    (2019-01-08) Goerzen, Thomas
    Companies increasingly involve the crowd in collective decision making and, to aggregate the decisions, they commonly average the scores. By ignoring crowdworkers’ different levels of experience and decision biases, this method may not produce the best outcome. Alternatively, decisions can be weighted in favor of the more experienced judges in the crowd. However, previous research is inconclusive as to whether more experienced individuals are any better at avoiding decision biases. To answer this question, we conduct online crowd-based experiments with a range of treatments, comparing the anchoring effect among individuals with different levels of experience. Results indicate that greater experience not only fails to protect crowdworkers from the anchoring effect, but it also increases their confidence in their decisions compared to less experienced individuals, even when they are wrong. Our findings provide valuable insights for both researchers and practitioners interested in improving the effectiveness of crowdworking decision making.