Refine
Year of publication
- 2023 (116)
Institute
- Fachbereich Medizintechnik und Technomathematik (29)
- Fachbereich Elektrotechnik und Informationstechnik (21)
- Fachbereich Luft- und Raumfahrttechnik (20)
- ECSM European Center for Sustainable Mobility (18)
- Fachbereich Chemie und Biotechnologie (16)
- Fachbereich Energietechnik (13)
- INB - Institut für Nano- und Biotechnologien (11)
- IfB - Institut für Bioengineering (9)
- Fachbereich Wirtschaftswissenschaften (8)
- Fachbereich Maschinenbau und Mechatronik (7)
Language
- English (116)
Document Type
- Article (66)
- Conference Proceeding (35)
- Part of a Book (6)
- Habilitation (2)
- Preprint (2)
- Talk (2)
- Book (1)
- Conference: Meeting Abstract (1)
- Contribution to a Periodical (1)
Keywords
- Information extraction (3)
- Natural language processing (3)
- Associated liquids (2)
- Bacillaceae (2)
- Biotechnological application (2)
- CFD (2)
- Diversity Management (2)
- Engineering Habitus (2)
- Future Skills (2)
- Interdisciplinarity (2)
The growing body of political texts opens up new opportunities for rich insights into political dynamics and ideologies, but it also increases the workload for manual analysis. Automated speaker attribution, which detects who said what to whom in a speech event and is closely related to semantic role labeling, is an important processing step for computational text analysis. We study the potential of the large language model family Llama 2 to automate speaker attribution in German parliamentary debates from 2017-2021. We fine-tune Llama 2 with QLoRA, an efficient training strategy, and find that our approach achieves competitive performance in the GermEval 2023 Shared Task on Speaker Attribution in German News Articles and Parliamentary Debates. Our results shed light on the capabilities of large language models in automating speaker attribution, revealing a promising avenue for computational analysis of political discourse and the development of semantic role labeling systems.
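The QLoRA setup described above can be sketched as a configuration fragment with Hugging Face `transformers` and `peft`. This is an illustrative sketch only: the model name and all hyperparameters are placeholders, not those used in the paper, and running it requires downloading the base model.

```python
# Illustrative QLoRA configuration sketch (hyperparameters are placeholders,
# not those reported in the paper).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

bnb = BitsAndBytesConfig(
    load_in_4bit=True,                       # 4-bit base weights: the "Q" in QLoRA
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",              # assumed base model, for illustration
    quantization_config=bnb,
)
lora = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,  # low-rank adapter settings
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)          # only the small adapters are trained
```

The quantized base model stays frozen; training updates only the low-rank adapter matrices, which is what makes fine-tuning a 7B+ model feasible on a single GPU.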
Software development projects often fail because of insufficient code quality. It is now well documented that the task of testing software, for example, is perceived as uninteresting and rather boring, leading to poor software quality and major challenges for software development companies. One promising approach to increasing the motivation for considering software quality is the use of gamification. Initial research has already investigated the effects of gamification on software developers and arrived at promising results. Nevertheless, there is a lack of results from field experiments, which motivates the chapter at hand. By conducting a gamification experiment with five student software projects and by interviewing the project members, the chapter provides insights into the changing programming behavior of information systems students when confronted with a leaderboard. The results reveal a motivational effect as well as a reduction of code smells.
Immunosorbent turnip vein clearing virus (TVCV) particles displaying the IgG-binding domains D and E of Staphylococcus aureus protein A (PA) on every coat protein (CP) subunit (TVCVPA) were purified from plants via optimized and new protocols. The latter used polyethylene glycol (PEG) raw precipitates, from which virions were selectively re-solubilized in reverse PEG concentration gradients. This procedure improved the integrity of both TVCVPA and the wild-type subgroup 3 tobamovirus. TVCVPA could be loaded with more than 500 IgGs per virion, which mediated the immunocapture of fluorescent dyes, GFP, and active enzymes. Bi-enzyme ensembles of cooperating glucose oxidase and horseradish peroxidase were tethered together on the TVCVPA carriers via a single antibody type, with one enzyme conjugated chemically to its Fc region, and the other one bound as a target, yielding synthetic multi-enzyme complexes. In microtiter plates, the TVCVPA-displayed sugar-sensing system possessed a considerably increased reusability upon repeated testing, compared to the IgG-bound enzyme pair in the absence of the virus. A high coverage of the viral adapters was also achieved on Ta2O5 sensor chip surfaces coated with a polyelectrolyte interlayer, as a prerequisite for durable TVCVPA-assisted electrochemical biosensing via modularly IgG-assembled sensor enzymes.
Hydrogen peroxide (H₂O₂), a strong oxidizer, is a commonly used sterilization agent in aseptic food processing and medical applications. To assess sterilization efficiency with H₂O₂, bacterial spores are commonly used test systems due to their remarkable robustness against a wide variety of decontamination strategies. Despite their widespread use, there is, however, only little information about the detailed time-resolved mechanism underlying oxidative spore death by H₂O₂. In this work, we investigate chemical and morphological changes of individual Bacillus atrophaeus spores undergoing oxidative damage in real time, using optical sensing with trapping Raman microscopy. The time-resolved experiments reveal that spore death involves two distinct phases: (i) an initial phase dominated by the fast release of dipicolinic acid (DPA), a major spore biomarker, which indicates the rupture of the spore’s core; and (ii) the oxidation of the remaining spore material, resulting in the subsequent fragmentation of the spore coat. Simultaneous observation of the spore morphology by optical microscopy corroborates these mechanisms. The dependence of the onset of DPA release and of the time constant of spore fragmentation on the H₂O₂ concentration shows that the formation of reactive oxygen species from H₂O₂ is the rate-limiting factor of oxidative spore death.
Clinical assessment of newly developed sensors is important for ensuring their validity. Comparing recordings of emerging electrocardiography (ECG) systems to a reference ECG system requires accurate synchronization of data from both devices. Current methods can be inefficient and prone to errors. To address this issue, three algorithms are presented to synchronize two ECG time series from different recording systems: Binned R-peak Correlation, R-R Interval Correlation, and Average R-peak Distance. These algorithms reduce ECG data to their cyclic features, mitigating inefficiencies and minimizing discrepancies between different recording systems. We evaluate the performance of these algorithms using high-quality data and then assess their robustness after manipulating the R-peaks. Our results show that R-R Interval Correlation was the most efficient, whereas the Average R-peak Distance and Binned R-peak Correlation were more robust against noisy data.
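The R-R Interval Correlation idea can be sketched in a few lines: both recordings are reduced to their sequences of beat-to-beat intervals, and the beat offset between them is found by maximizing the correlation of those sequences. This is a minimal pure-Python illustration under my own simplifications (a one-sided fixed-lag search, hypothetical function names), not the authors' implementation.

```python
def rr_intervals(r_peaks):
    """Differences between consecutive R-peak times (seconds)."""
    return [b - a for a, b in zip(r_peaks, r_peaks[1:])]

def best_lag(ref_peaks, test_peaks, max_lag=10):
    """Beat offset between two recordings, found by maximising the Pearson
    correlation of their R-R interval sequences over candidate lags."""
    ref, test = rr_intervals(ref_peaks), rr_intervals(test_peaks)

    def corr(xs, ys):
        n = min(len(xs), len(ys))
        xs, ys = xs[:n], ys[:n]
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den if den else 0.0

    scores = {lag: corr(ref[lag:], test) for lag in range(max_lag)}
    return max(scores, key=scores.get)
```

Because only the intervals between beats are compared, a constant clock offset between the two devices drops out entirely, which is the point of reducing the ECG to its cyclic features.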
The first and last mile of a railway journey, in both freight and transit applications, involves high effort and is either non-productive (e.g. in the case of depot operations) or highly inefficient (e.g. in industrial railways). These parts are typically operated on-sight, i.e. with no signalling and train protection systems ensuring the freedom of movement. This is possible due to the rather short braking distances of individual vehicles and shunting consists. The present article analyses the braking behaviour of such shunting units. For this purpose, a dedicated model is developed, calibrated on published results of brake tests, and validated against a high-definition model for low-speed applications. Based on this model, a Monte Carlo simulation of the resulting braking distances is performed. From the distribution properties and established safety levels, the risk of exceeding certain braking distances is evaluated and maximum braking distances are derived. Together with certain parameters of the system, these can serve in the design and safety assessment of driver assistance systems and in the automation of these processes.
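The Monte Carlo idea above can be sketched as follows: sample a braking deceleration from an assumed distribution, compute the stopping distance for each sample, and read a maximum braking distance off a high quantile of the resulting distribution. All parameter values here are illustrative placeholders, not those of the article's calibrated model.

```python
import random

def braking_distance(v0, decel):
    """Stopping distance (m) for initial speed v0 (m/s) at constant deceleration (m/s^2)."""
    return v0 * v0 / (2.0 * decel)

def tail_braking_distance(v0=8.0, n=100_000, seed=42):
    """Sample decelerations (illustrative normal distribution, clipped to stay
    physical) and return the distance exceeded in only 1 in 10,000 runs."""
    rng = random.Random(seed)
    dists = []
    for _ in range(n):
        a = max(0.2, rng.gauss(0.9, 0.15))  # clip to avoid near-zero deceleration
        dists.append(braking_distance(v0, a))
    dists.sort()
    return dists[int(0.9999 * n)]           # 99.99th-percentile braking distance
```

The derived maximum braking distance is then the tail quantile matching the chosen safety level, rather than the nominal distance at the mean deceleration.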
The RoboCup Logistics League (RCLL) is a robotics competition in a production logistics scenario in the context of a Smart Factory. In the competition, a team of three robots needs to assemble products to fulfill various orders that are requested online during the game. This year, the Carologistics team was able to win the competition with a new approach to multi-agent coordination, as well as significant changes to the robots’ perception units and a pragmatic network setup using the cellular network instead of WiFi. In this paper, we describe the major components of our approach, with a focus on the changes compared to the last physical competition in 2019.
Due to the increasing complexity of software projects, software development is becoming more and more dependent on teams. The quality of this teamwork can vary depending on the team composition, as teams are always a combination of different skills and personality types. This paper aims to answer the question of how to describe a software development team and what influence the personality of the team members has on the team dynamics. For this purpose, a systematic literature review (n=48) and a literature search with the AI research assistant Elicit (n=20) were conducted. Result: A person’s personality significantly shapes his or her thinking and actions, which in turn influences his or her behavior in software development teams. It has been shown that team performance and satisfaction can be strongly influenced by personality. The quality of communication and the likelihood of conflict can also be attributed to personality.
This paper presents an approach for reducing the cognitive load of humans working in quality control (QC) for production processes that adhere to the 6σ methodology. While 100% QC requires every part to be inspected, this workload can be reduced when a human-in-the-loop QC process is supported by an anomaly detection system that presents for manual inspection only those parts with a significant likelihood of being defective. This approach shows good results when applied to image-based QC for metal textile products.
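The gating step described above can be sketched generically: given an anomaly score per part from some detection model, only parts above a threshold are routed to the human inspector, and the workload reduction is the fraction that never reaches them. Function names and scores are hypothetical; the paper's actual detection model is image-based and not reproduced here.

```python
def select_for_inspection(scores, threshold):
    """Indices of parts whose anomaly score warrants manual inspection."""
    return [i for i, s in enumerate(scores) if s >= threshold]

def workload_reduction(scores, threshold):
    """Fraction of parts the human inspector no longer has to examine."""
    flagged = select_for_inspection(scores, threshold)
    return 1.0 - len(flagged) / len(scores)
```

The threshold trades workload against risk: lowering it sends more parts to the inspector, raising it increases the chance a defective part skips manual inspection, which is why it must be set against the process's 6σ defect-rate target.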
Experimental determination of the cross sections of proton capture on radioactive nuclei is extremely difficult, yet such data are of substantial interest for understanding the production of the p-nuclei. For the first time, a direct measurement of proton-capture cross sections on stored, radioactive ions became possible in an energy range of interest for nuclear astrophysics. The experiment was performed at the Experimental Storage Ring (ESR) at GSI, making use of a sensitive method to measure (p,γ) and (p,n) reactions in inverse kinematics. These reaction channels are of high relevance for the nucleosynthesis processes in supernovae, which are among the most violent explosions in the universe and are not yet well understood. The cross section of the ¹¹⁸Te(p,γ) reaction has been measured at energies of 6 MeV/u and 7 MeV/u. The heavy ions interacted with a hydrogen gas jet target. The radiative recombination of the fully stripped ¹¹⁸Te ions with electrons from the hydrogen target was used as a luminosity monitor. An overview of the experimental method and preliminary results from the ongoing analysis are presented.