INFLUENCES OF SENSORY MODALITY AND STIMULUS TYPE ON PROCESSING TASK-IRRELEVANT STIMULI

Date
2019
Authors
Walker, Maegen
Contributor
Advisor
Sinnett, Scott
Department
Psychology
Abstract
The processing of unattended, task-irrelevant stimuli that are frequently presented in temporal alignment with an attended target (i.e., target-aligned, or TA) during an attention-demanding task is often facilitated. This facilitation results in higher recognition rates for TA items, when encountered later in a surprise recognition test, compared to other unattended items that are presented with equal frequency but do not appear in temporal alignment with an attended target (i.e., non-aligned, or NA). Previous investigations of this facilitated processing of unattended TA items have traditionally focused on word processing in the visual modality. Relatively few studies have explicitly examined the role of sensory modality or the type of stimuli being presented when evaluating the extent to which unattended information may be facilitated in an attention-demanding task. Because humans live in a multisensory environment in which we are exposed to a variety of stimuli, it is important to investigate the extent to which processing of different types of items may be facilitated when unattended, and whether the sensory modality in which these stimuli are presented plays a role in that facilitation. Arguably, vision and audition are the two most dominant sensory modalities for humans, and two main forms of information often encountered in daily life are lexical information (i.e., words, both written and spoken) and non-lexical information (i.e., pictures and sounds). The experiments presented in this doctoral dissertation explore the roles that sensory modality (vision and audition) and stimulus type (lexical and non-lexical) play in the facilitated processing of unattended, task-irrelevant stimuli.
Experiment 1 compared facilitation rates for lexical and non-lexical stimuli under unimodal visual conditions (Experiment 1a) and unimodal auditory conditions (Experiment 1b). It was hypothesized that all TA items would undergo facilitated processing, leading to higher recognition rates compared to NA items, and that non-lexical information would be preferentially facilitated overall compared to lexical information. Collectively, the results of Experiment 1 demonstrate that under unimodal visual conditions the facilitatory effects of target alignment remain robust, with higher recognition rates for TA items than for NA items regardless of stimulus type. Additionally, unattended non-lexical items (i.e., pictures) appear to be processed more extensively (i.e., facilitated), leading to higher recognition rates overall compared to lexical items (i.e., written words), and the impact of target alignment remains uniform across these stimulus dimensions. However, under unimodal auditory conditions (Experiment 1b), target alignment does not appear to play a critical role in facilitation, as TA items were not recognized more often than NA items. Additionally, there was no significant difference in recognition rates for unattended non-lexical items (i.e., sounds) compared to lexical items (i.e., auditory words), suggesting that stimulus type also has little bearing on the facilitated processing of unattended, task-irrelevant auditory items.
Experiment 2 extended the unimodal presentation conditions of the first experiment to cross-modal conditions while again comparing facilitation rates for lexical and non-lexical stimuli. There were two conditions: an auditory/visual condition in which the unattended dimension was presented in the visual modality while participants performed an attention-demanding auditory task (Experiment 2a), and a visual/auditory condition in which the unattended dimension was presented in the auditory modality while participants performed an attention-demanding visual task (Experiment 2b). As with Experiment 1, main effects of target alignment and of stimulus type were anticipated under both conditions. Results from Experiment 2 mirrored those of Experiment 1. In Experiment 2a, the effects of target alignment and stimulus type remained robust for unattended visual information presented concurrently with an attention-demanding auditory task: TA items were recognized more often than NA items, and non-lexical items (i.e., pictures) were recognized more often than lexical items (i.e., written words). In Experiment 2b, the effects of target alignment and stimulus type remained inconsequential: TA items were not recognized more often than NA items, and there was no difference in recognition rates between lexical items (i.e., auditory words) and non-lexical items (i.e., sounds).
Taken together, Experiments 1 and 2 demonstrate that facilitated processing of unattended lexical and non-lexical information may proceed differently depending on the sensory modality in which those items are presented. In the visual modality, target alignment facilitates processing of unattended items regardless of stimulus type, and, regardless of target alignment, unattended pictures appear to be processed more extensively than unattended written words. In the auditory modality, target alignment does not appear to play a critical role in the processing of unattended information, as there was no difference in recognition rates between TA and NA items; furthermore, there appears to be no difference in the extent to which unattended auditory words and sounds are processed. The theoretical rationale underpinning these divergent patterns of facilitation for lexical and non-lexical stimuli across the two sensory modalities is explored extensively in this dissertation.
Keywords
Cognitive psychology, Attention, Cognition, Facilitation, Multisensory Processing, Task-Irrelevant Stimuli
Extent
193 pages