The MAD-EEG Dataset is a research corpus for studying EEG-based auditory attention decoding to a target instrument in polyphonic music. It consists of 20-channel EEG responses recorded from 8 subjects while they attended to a particular instrument in a music mixture. For further details, please refer to the paper "MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music". If you use the data in your research, please cite the paper (not just the Zenodo record): @inproceedings{Cantisani2019, author={Giorgia Cantisani and Gabriel Trégoat and Slim Essid and Gaël Richard}, title={{MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music...
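The standard baseline for EEG-based auditory attention decoding of this kind is a linear backward model: a ridge-regularised decoder that reconstructs the attended stimulus envelope from the multichannel EEG and scores attention by correlation. The sketch below illustrates that technique on synthetic data only; it makes no assumption about the MAD-EEG file layout, channel names, or preprocessing, and the sampling rate and noise level are illustrative choices.

```python
# Minimal sketch of linear stimulus-reconstruction (backward-model) decoding,
# a common baseline for EEG-based auditory attention decoding.
# All data here are synthetic; nothing below reflects the actual MAD-EEG files.
import numpy as np

rng = np.random.default_rng(0)
fs = 64                                 # feature sampling rate (Hz), illustrative
n_samples, n_channels = fs * 30, 20     # 30 s of 20-channel "EEG"

# Synthetic attended envelope and EEG that partly encodes it plus noise
envelope = np.abs(rng.standard_normal(n_samples))
mixing = rng.standard_normal(n_channels)
eeg = np.outer(envelope, mixing) + 0.5 * rng.standard_normal((n_samples, n_channels))

# Ridge-regularised least squares: map EEG channels back to the envelope
lam = 1e-2
W = np.linalg.solve(eeg.T @ eeg + lam * np.eye(n_channels), eeg.T @ envelope)
reconstruction = eeg @ W

# Decoding quality: Pearson correlation between true and reconstructed envelope.
# In a real attention-decoding setup, this correlation would be computed against
# each candidate instrument's envelope, and the highest-scoring one declared attended.
r = np.corrcoef(envelope, reconstruction)[0, 1]
print(f"reconstruction correlation r = {r:.2f}")
```

In practice the decoder would also include time-lagged EEG features (a temporal context window) rather than instantaneous samples, but the instantaneous version above shows the core least-squares step.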
Music information retrieval (MIR) methods offer interesting possibilities for automatically identify...
In the current study we use electroencephalography (EEG) to detect hea...
Rapid changes in the stimulus envelope (indicating tone onsets) elicit an N1-P2 ERP response, as has...
We present MAD-EEG, a new, freely available dataset for studying EEG-based aud...
Polyphonic music (music consisting of several instruments playing in parallel) is an intuitive way o...
The article provides an open-source Music Listening-Genre (MUSIN-G) EEG dataset which contains 20 p...
This dataset contains EEG recordings from 18 subjects listening to continuous sound, either speech o...
This dataset accompanies the publication by Nicolaou et al. (2017), "Directed motor-auditory EEG con...
Summary: This dataset contains electroencephalographic recordings of 12 subjects listening to music...
Note onsets in music are acoustic landmarks providing auditory cues that underlie the perception of...
This dataset is associated with the manuscript "Nonlinear decoding models enable music reconstructio...
A dataset investigating neural responses to continuous musical pieces recorded with bipolar EEG. A...
The objective of our research is to develop Brain-Computer Interfacing (BCI) for musical application...