Visual-Auditory Multimodal Decoding from Multi-Channel Human Intracranial EEG
Keywords:
Brain-machine interface, Neural data analysis

Abstract
The human brain's response to video stimuli engages multiple sensory modalities, including the visual and auditory systems. Nevertheless, the neural mechanisms underlying the processing of video stimuli remain incompletely understood. One factor contributing to this knowledge gap is the scarcity of corresponding human electrophysiology datasets, particularly recordings from deep brain regions. This work presents an intracranial EEG (iEEG) dataset acquired from 66 electrodes in an epilepsy patient over 300 minutes while the patient viewed customized video stimuli. The dataset covers a wide range of cortical and subcortical brain regions, with the goal of advancing our understanding of the perceptual processing mechanisms engaged during video observation. Based on this dataset, we conducted classification experiments to assess the contribution of individual brain regions to the decoding of diverse video, auditory, and visual-scene stimuli.
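The classification analysis described above could, in broad strokes, resemble the following sketch: training a cross-validated classifier on per-trial, per-channel iEEG features to decode stimulus category. This is a minimal illustration using synthetic data (the feature extraction, channel count per region, number of stimulus categories, and classifier choice here are assumptions, not the paper's actual pipeline):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial features from 66 iEEG channels
# (e.g., band power per channel); NOT the real dataset.
n_trials, n_channels = 120, 66
X = rng.normal(size=(n_trials, n_channels))
y = rng.integers(0, 3, size=n_trials)  # 3 hypothetical stimulus categories

# Inject class-dependent signal into two disjoint channel subsets,
# mimicking region-specific selectivity for different stimulus types.
X[y == 1, :10] += 0.8
X[y == 2, 10:20] += 0.8

# Standardize features, then fit a multinomial logistic-regression decoder
# with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

Region-level importance could then be probed by restricting the feature matrix to channels from one anatomical region at a time and comparing the resulting cross-validated accuracies against chance level (1/3 here).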
License
Copyright (c) 2025 The Author(s)

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.