Time-Frequency and Synchrony Analysis of Responses to Steady-State Auditory and Musical Stimuli from Multichannel EEG
Brain responses to audio stimuli are analysed using data-driven time-frequency analysis. This is achieved from electroencephalogram (EEG) recordings, with auditory chirps or music as the audio stimulus. Empirical mode decomposition (EMD) is applied to multichannel EEG recordings, and insight into the brain responses is provided by analysing the dynamics of auditory steady-state responses (ASSR). The proposed approach is further illustrated by an analysis of EEG responses to classical music. A comprehensive synchrony analysis is provided based on the visualisation of EMD and spectrogram-matching techniques. Simulation results illustrate the potential of the proposed approach in future brain-computer/machine interfaces.
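The ASSR analysis described above rests on measuring how strongly an EEG channel entrains to the frequency of a periodic auditory stimulus. As a minimal, hedged sketch of that idea (not the paper's actual pipeline, which uses EMD-based time-frequency analysis), the snippet below estimates the power of a single EEG channel at a chosen stimulation frequency with a single-bin DFT; the 40 Hz modulation rate and 500 Hz sampling rate are illustrative assumptions only.

```python
import math

def band_power(signal, fs, f0):
    """Power of `signal` (sampled at `fs` Hz) at frequency `f0`,
    via a single-bin discrete Fourier transform."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * f0 * k / fs)
             for k, x in enumerate(signal))
    im = -sum(x * math.sin(2 * math.pi * f0 * k / fs)
              for k, x in enumerate(signal))
    return (re * re + im * im) / (n * n)

# Illustrative synthetic "EEG": a 40 Hz steady-state component
# (a common ASSR modulation rate) sampled at an assumed 500 Hz.
fs = 500
t = [k / fs for k in range(fs)]          # one second of data
eeg = [math.sin(2 * math.pi * 40 * tk) for tk in t]

# Entrainment shows up as a power peak at the stimulation frequency:
p_stim = band_power(eeg, fs, 40.0)       # power at the stimulus rate
p_off = band_power(eeg, fs, 35.0)        # power at an off-target rate
```

In practice, tracking how `p_stim` evolves over successive short windows gives a simple view of ASSR dynamics, which is the kind of time-varying behaviour the EMD-based approach captures adaptively.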