Augmented/Virtual/Mixed Reality: A Paradigm for Immersive and Collaborative Computing

Permanent URI for this collection: https://hdl.handle.net/10125/112399

Recent Submissions

  • Task-Technology Fit in Virtual Reality: Explaining Behavioral Intention to Use and Perceived Performance
    (2026-01-06) Paetow, Thomas; Wichmann, Johannes; Leyer, Michael
    While Virtual Reality (VR) is becoming more popular for professional applications, empirical evidence on its task-specific suitability remains scarce. This study employs the Task-Technology Fit (TTF) model to evaluate the perceived fit of VR for workplace tasks using McGrath’s Group Task Circumplex. Results from an online survey, analyzed with Partial Least Squares Structural Equation Modeling (PLS-SEM), indicate that TTF relates to performance perceptions and behavioral intention to use, explaining a substantial share of variance in both outcomes. The study links task taxonomies to TTF in immersive settings and shows that task and technology characteristics both contribute to perceived fit, with social presence as the most salient technology factor. Practically, the findings support task-aware selection and design of VR solutions that align with task profiles. Limitations include simulated (non-headset) stimuli and a personalization-comparability trade-off introduced by task filtering.
  • Seeing through Immersion: Integrating Human Computer Interaction Design Experience with Nano-Internet of Things Visualization in Safety-Critical Systems
    (2026-01-06) Rancy, Jean-Philippe; Grabowski, Martha
    Safety-critical systems demand intuitive and adaptive human-computer interfaces to support operator decision-making, situational awareness, and performance under pressure. This research-in-progress paper explores the integration of user-centered design methods with immersive display technologies and nano-IoT visualization in maritime navigation. Building on prior empirical work involving wearable AR systems and head-mounted displays, we apply a design framework grounded in immersive navigation, manipulation, and control to inform the development of a transparent, flexible nano-enabled display. Using mixed-methods research in high-fidelity simulation environments, this study analyzes user interaction across performance, communication, and cognitive workload domains. Findings will inform the development of next-generation visual interfaces designed to enhance human-machine coordination in dynamic, high-stakes settings.
  • Towards Effective Visual Alerts in Immersive VR Environments: Balancing Visibility and User Experience
    (2026-01-06) Olejnik-Krugły, Agnieszka; Bortko, Kamil; Jankowski, Jarosław; Bródka, Julia
    As virtual reality (VR) technologies become increasingly integrated into everyday tasks and professional environments, there is a growing need to adapt user interface (UI) design principles to immersive contexts. Traditional UX methodologies, while well-established in desktop and mobile systems, prove insufficient for addressing the perceptual and cognitive demands of VR. This study investigates design strategies for alert systems in VR interfaces, focusing on their perceptual effectiveness under varying spatial and color configurations. The findings highlight clear spatial and color-based preferences, and result in practical design recommendations for improving alert saliency and user response in immersive systems.
  • Extended Reality, Expanded Empathy: A Psychological Distance Perspective for Designing XR-Based Empathy Training Interventions
    (2026-01-06) Mattar, Laudy; Carillo, Kévin
    Empathy is increasingly recognized as a core 21st-century skill essential for collaboration, inclusion, and organizational well-being. As modern information technology advances, extended reality (XR), including augmented reality (AR), mixed reality (MR), and virtual reality (VR), has seen increasing use in training contexts. XR-based training offers unique potential to cultivate empathy by enabling experiential learning structured around presence, immersion, and interactivity. Yet the absence of structured frameworks limits the efficacy and scalability of XR-based empathy training. This paper presents a conceptual framework integrating narrative design elements (character, space, narrative), experiential dimensions (presence, immersion, interactivity), and Construal Level Theory (CLT) to guide the design and evaluation of empathy-enhancing XR interventions. The framework explains how XR can reduce social, spatial, temporal, and hypothetical distance to activate affective, cognitive, and behavioral empathy. To demonstrate practical applicability, we apply the framework to two illustrative cases, confirming its value for guiding XR scenario design and evaluation. We discuss implications for XR-based empathy training, empathy-centered organizational learning, and the integration of empathy as a measurable outcome, offering clear guidance for researchers, designers, HR professionals, and learning leaders.