
Preprint: Data-efficient sleep staging with synthetic time series pretraining

Abstract: Analyzing electroencephalographic (EEG) time series can be challenging, especially with deep neural networks, due to the large variability among human subjects and often small datasets. To address these challenges, various strategies, such as self-supervised learning, have been suggested, but they typically rely on extensive empirical datasets. Inspired by recent advances in computer vision, we propose a pretraining task termed "frequency pretraining" to pretrain a neural network for sleep staging by predicting the frequency content of randomly generated synthetic time series. Our experiments demonstrate that our method surpasses fully supervised learning in scenarios with limited data and few subjects, and matches its performance in regimes with many subjects. Furthermore, our results underline the relevance of frequency information for sleep stage scoring, while also demonstrating that deep neural networks utilize information beyond frequencies to enhance sleep staging performance, which is consistent with previous research. We anticipate that our approach will be advantageous across a broad spectrum of applications where EEG data is limited or derived from a small number of subjects, including the domain of brain-computer interfaces.
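To make the pretraining task more concrete, the following is a minimal sketch of the general idea described in the abstract: synthetic time series are generated at random, and a network is pretrained to predict their frequency content before being fine-tuned for sleep staging. All specifics below (sampling rate, frequency-bin layout, signal model, and the helper name make_synthetic_example) are illustrative assumptions and not taken from the paper.

import numpy as np

# Illustrative parameters (assumptions, not from the paper)
SAMPLING_RATE = 100      # Hz, a common rate for sleep EEG recordings
EPOCH_SECONDS = 30       # length of one sleep-staging epoch
N_BINS = 20              # number of frequency bins used as pretraining labels
FREQ_EDGES = np.linspace(0.5, 30.0, N_BINS + 1)  # EEG-relevant frequency band

def make_synthetic_example(rng):
    """Generate one random synthetic time series and its frequency-bin label.

    The signal is a sum of sinusoids whose frequencies fall into a random
    subset of the bins; the multi-hot label marks which bins are present.
    """
    n_samples = SAMPLING_RATE * EPOCH_SECONDS
    t = np.arange(n_samples) / SAMPLING_RATE

    active_bins = rng.random(N_BINS) < 0.3          # random subset of bins
    label = active_bins.astype(np.float32)

    signal = np.zeros(n_samples, dtype=np.float32)
    for b in np.flatnonzero(active_bins):
        freq = rng.uniform(FREQ_EDGES[b], FREQ_EDGES[b + 1])
        phase = rng.uniform(0.0, 2.0 * np.pi)
        amp = rng.uniform(0.5, 1.5)
        signal += amp * np.sin(2.0 * np.pi * freq * t + phase)

    signal += 0.1 * rng.standard_normal(n_samples)  # additive noise
    return signal, label

# A network pretrained to predict `label` from `signal` (a multi-label
# classification task) can subsequently be fine-tuned on real EEG epochs
# for sleep staging.
rng = np.random.default_rng(0)
x, y = make_synthetic_example(rng)

Because the pretraining data are generated on the fly, this setup requires no labeled (or even unlabeled) empirical EEG, which is what makes it attractive in the low-data regimes the abstract describes.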

Metadata
Author: Niklas Grieger, Siamak Mehrkanoon, Stephan Bialonski
Parent Title (English): arXiv
Document Type: Preprint
Language: English
Year of Completion: 2024
Date of Publication (Server): 2024/03/14
Length: 10 pages
Link: https://arxiv.org/abs/2403.08592
Access: worldwide
Institutes: FH Aachen / Fachbereich Medizintechnik und Technomathematik