Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources

Please use this identifier to cite or link to this item:
https://osnadocs.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-2017032715716
Full metadata record
DC Field — Value — Language
dc.creator: Wahn, Basil
dc.creator: König, Peter
dc.date.accessioned: 2017-03-27T06:59:39Z
dc.date.available: 2017-03-27T06:59:39Z
dc.date.issued: 2017-03-27T06:59:39Z
dc.identifier.citation: Frontiers in Integrative Neuroscience, Vol. 10, Article 13, 2016, pp. 1-13
dc.identifier.uri: https://osnadocs.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-2017032715716
dc.description.abstract [eng]: Humans constantly process and integrate sensory input from multiple sensory modalities. However, the amount of input that can be processed is constrained by limited attentional resources. A matter of ongoing debate is whether attentional resources are shared across sensory modalities, and whether multisensory integration is dependent on attentional resources. Previous research suggested that the distribution of attentional resources across sensory modalities depends on the type of task. Here, we tested a novel task combination in a dual task paradigm: Participants performed a self-terminated visual search task and a localization task in either separate sensory modalities (i.e., haptics and vision) or both within the visual modality. The tasks interfered considerably. However, participants performed the visual search task faster when the localization task was performed in the tactile modality in comparison to performing both tasks within the visual modality. This finding indicates that tasks performed in separate sensory modalities rely in part on distinct attentional resources. Nevertheless, participants integrated visuotactile information optimally in the localization task even when attentional resources were diverted to the visual search task. Overall, our findings suggest that visual search and tactile localization partly rely on distinct attentional resources, and that optimal visuotactile integration is not dependent on attentional resources.
dc.relation: http://journal.frontiersin.org/article/10.3389/fnint.2016.00013/full
dc.rights: Attribution 4.0 International (CC BY 4.0)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject [eng]: attentional load
dc.subject [eng]: multisensory integration
dc.subject [eng]: visual modality
dc.subject [eng]: tactile modality
dc.subject [eng]: attentional resources
dc.subject [eng]: visual search
dc.subject [eng]: tactile display
dc.subject.ddc: 610 - Medicine and health
dc.title [eng]: Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources
dc.type: Journal article [article]
dc.identifier.doi: 10.3389/fnint.2016.00013
vCard.ORG: FB8
Appears in collections: FB08 - Hochschulschriften
Open-Access-Publikationsfonds

Files in this item:
File — Description — Size — Format
Zeitschriftenartikel_Front_Integr_Neurosci_10_13_2016_Wahn.pdf — 1.92 MB — Adobe PDF


This item has been published under the following copyright license: Creative Commons License