TY - INPR
A1 - Ringers, Christa
A1 - Bialonski, Stephan
A1 - Solovev, Anton
A1 - Hansen, Jan N.
A1 - Ege, Mert
A1 - Friedrich, Benjamin M.
A1 - Jurisch-Yaksi, Nathalie
T1 - Preprint: Local synchronization of cilia and tissue-scale cilia alignment are sufficient for global metachronal waves
T2 - bioRxiv
N2 - Motile cilia are hair-like cell extensions present in multiple organs of the body. How cilia coordinate their regular beat in multiciliated epithelia to move fluids remains insufficiently understood, particularly due to a lack of rigorous quantification. Here we combine experiments, novel analysis tools, and theory to address this knowledge gap. We investigate the collective dynamics of cilia in the zebrafish nose, due to its conserved properties with other ciliated tissues and its superior accessibility for non-invasive imaging. We revealed that cilia are synchronized only locally and that the size of local synchronization domains increases with the viscosity of the surrounding medium. Although synchronization is only local, we observed global patterns of traveling metachronal waves across the multiciliated epithelium. Intriguingly, these global wave direction patterns are conserved across individual fish, but different for the left and right nose, unveiling a chiral asymmetry of metachronal coordination. To understand the implications of synchronization for fluid pumping, we used a computational model of a regular array of cilia. We found that local metachronal synchronization prevents steric collisions and improves fluid pumping in dense cilia carpets, but hardly affects the direction of fluid flow. In conclusion, we show that local synchronization together with tissue-scale cilia alignment is sufficient to generate metachronal wave patterns in multiciliated epithelia, which enhance their physiological function of fluid pumping.
Y1 - 2021
U6 - https://doi.org/10.1101/2021.11.23.469646
N1 - Published in eLife 12:e77701 (https://doi.org/10.7554/eLife.77701).
ER -
TY - INPR
A1 - Grieger, Niklas
A1 - Mehrkanoon, Siamak
A1 - Bialonski, Stephan
T1 - Preprint: Data-efficient sleep staging with synthetic time series pretraining
T2 - arXiv
N2 - Analyzing electroencephalographic (EEG) time series can be challenging, especially with deep neural networks, due to the large variability among human subjects and often small datasets. To address these challenges, various strategies, such as self-supervised learning, have been suggested, but they typically rely on extensive empirical datasets. Inspired by recent advances in computer vision, we propose a pretraining task termed "frequency pretraining" to pretrain a neural network for sleep staging by predicting the frequency content of randomly generated synthetic time series. Our experiments demonstrate that our method surpasses fully supervised learning in scenarios with limited data and few subjects, and matches its performance in regimes with many subjects. Furthermore, our results underline the relevance of frequency information for sleep stage scoring, while also demonstrating that deep neural networks utilize information beyond frequencies to enhance sleep staging performance, which is consistent with previous research. We anticipate that our approach will be advantageous across a broad spectrum of applications where EEG data is limited or derived from a small number of subjects, including the domain of brain-computer interfaces.
Y1 - 2024
ER -
TY - INPR
A1 - Bremm, Florian
A1 - Blaneck, Patrick Gustav
A1 - Bornheim, Tobias
A1 - Grieger, Niklas
A1 - Bialonski, Stephan
T1 - Preprint: Detecting sexism in German online newspaper comments with open-source text embeddings
BT - (Team GDA, GermEval2024 Shared Task 1: GerMS-Detect, Subtasks 1 and 2, Closed Track)
T2 - arXiv
N2 - Sexism in online media comments is a pervasive challenge that often manifests subtly, complicating moderation efforts, as interpretations of what constitutes sexism can vary among individuals. We study monolingual and multilingual open-source text embeddings to reliably detect sexism and misogyny in German-language online comments from an Austrian newspaper. We observed that classifiers trained on text embeddings closely mimic the individual judgements of human annotators. Our method showed robust performance in the GermEval 2024 GerMS-Detect Subtask 1 challenge, achieving an average macro F1 score of 0.597 (4th place, as reported on Codabench). It also accurately predicted the distribution of human annotations in GerMS-Detect Subtask 2, with an average Jensen-Shannon distance of 0.301 (2nd place). The computational efficiency of our approach suggests potential for scalable applications across various languages and linguistic contexts.
Y1 - 2024
U6 - https://doi.org/10.48550/arXiv.2403.08592
ER -
TY - INPR
A1 - Bornheim, Tobias
A1 - Grieger, Niklas
A1 - Blaneck, Patrick Gustav
A1 - Bialonski, Stephan
T1 - Preprint: Speaker attribution in German parliamentary debates with QLoRA-adapted large language models
T2 - arXiv
N2 - The growing body of political texts opens up new opportunities for rich insights into political dynamics and ideologies but also increases the workload for manual analysis. Automated speaker attribution, which detects who said what to whom in a speech event and is closely related to semantic role labeling, is an important processing step for computational text analysis.
We study the potential of the large language model family Llama 2 to automate speaker attribution in German parliamentary debates from 2017-2021. We fine-tune Llama 2 with QLoRA, an efficient training strategy, and observe that our approach achieves competitive performance in the GermEval 2023 Shared Task On Speaker Attribution in German News Articles and Parliamentary Debates. Our results shed light on the capabilities of large language models in automating speaker attribution, revealing a promising avenue for the computational analysis of political discourse and the development of semantic role labeling systems.
Y1 - 2023
U6 - https://doi.org/10.48550/arXiv.2309.09902
N1 - Published version available at: https://doi.org/10.21248/jlcl.37.2024.244
ER -