Self-Attention Long-Term Dependency Modelling in Electroencephalography Sleep Stage Prediction

EasyChair Preprint 6910, 12 pages
Date: October 20, 2021

Abstract

Complex sleep stage transition rules pose a challenge for learning inter-epoch context with Deep Neural Networks (DNNs) in ElectroEncephaloGraphy (EEG) based sleep scoring. While DNNs have overcome the limits of expert systems, the dominant bidirectional Long Short-Term Memory (LSTM) retains some of the limitations of Recurrent Neural Networks. We propose a sleep Self-Attention Model (SAM) that replaces LSTMs for inter-epoch context modelling in a sleep scoring DNN. Because self-attention can access distant EEG epochs as easily as adjacent ones, we aim to improve long-term dependency learning for critical sleep stages such as Rapid Eye Movement (REM). Restricting attention to a local scope reduces the computational complexity to linear in the recording duration. We evaluate SAM on two public sleep EEG datasets, MASS-SS3 and SEDF-78, and compare it to the literature and an LSTM baseline model via a paired t-test. On MASS-SS3, SAM achieves kappa = 0.80, equivalent to the best reported result, with no significant difference to the baseline. On SEDF-78, SAM achieves kappa = 0.78, a statistically significant improvement over the previous best results, with a +4% F1-score gain for REM. Strikingly, SAM achieves these results with a model size at least 50 times smaller than the baseline.

Keyphrases: Sleep Scoring, attention, inter-epoch context
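The abstract describes restricting self-attention to a local scope so that each epoch attends only to its neighbours. The paper's own implementation is not reproduced here; the following is a minimal PyTorch sketch of windowed self-attention over per-epoch feature vectors, where the function name local_self_attention, the window size, and the tensor shapes are illustrative assumptions, not the authors' code. For clarity the sketch materialises the full score matrix and masks it; a linear-cost version would compute only the banded scores.

```python
import torch
import torch.nn.functional as F

def local_self_attention(x, window=5):
    """Self-attention where each epoch attends only to a local window.

    x:      (batch, seq_len, dim) per-epoch EEG feature vectors.
    window: number of neighbouring epochs visible on each side.

    Illustrative sketch: builds the full (seq_len x seq_len) score
    matrix and masks it, so this version is still quadratic in memory;
    a banded computation would make the cost linear in seq_len.
    """
    b, n, d = x.shape
    # Scaled dot-product scores between every pair of epochs (single head,
    # identity projections, purely for illustration).
    scores = torch.einsum("bid,bjd->bij", x, x) / d ** 0.5  # (b, n, n)
    # Mask out positions farther than `window` epochs away.
    idx = torch.arange(n, device=x.device)
    mask = (idx[None, :] - idx[:, None]).abs() > window      # (n, n)
    scores = scores.masked_fill(mask, float("-inf"))
    attn = F.softmax(scores, dim=-1)
    return torch.bmm(attn, x)                                # (b, n, d)

# Example: one batch of two recordings, ~960 thirty-second epochs each,
# 64-dimensional epoch embeddings (all sizes hypothetical).
x = torch.randn(2, 960, 64)
y = local_self_attention(x, window=5)
print(y.shape)  # torch.Size([2, 960, 64])
```

Because the mask keeps only a fixed-width band around the diagonal, the number of non-masked attention weights grows linearly with the number of epochs, which matches the abstract's claim of linear complexity with respect to recording duration.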