Lead and nickel, as heavy metals, are still used in industrial processes and are classified as “environmental health hazards” due to their toxicity and polluting potential. Detecting heavy metals can prevent environmental pollution at levels that are toxic to human health. In this sense, the electrolyte–insulator–semiconductor (EIS) field-effect sensor is an attractive sensing platform for the fabrication of reusable and robust sensors to detect such substances. This study aimed to fabricate a sensing unit on an EIS device based on Sn₃O₄ nanobelts embedded in a polyelectrolyte matrix of polyvinylpyrrolidone (PVP) and polyacrylic acid (PAA) using the layer-by-layer (LbL) technique. The EIS-Sn₃O₄ sensor exhibited enhanced electrochemical performance for detecting Pb²⁺ and Ni²⁺ ions, revealing a higher affinity for Pb²⁺ ions, with sensitivities of ca. 25.8 mV/decade and 2.4 mV/decade, respectively. These results indicate that Sn₃O₄ nanobelts constitute a feasible proof-of-concept capacitive field-effect sensor for heavy metal detection and motivate future studies focusing on environmental monitoring.
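The reported sensitivities can be reproduced from a calibration curve as the slope of sensor potential versus the decadic logarithm of ion concentration. Below is a minimal sketch of that calculation; the function name and calibration data are hypothetical illustrations, not taken from the study.

```python
import math

def sensitivity_mv_per_decade(concentrations_molar, potentials_mv):
    """Least-squares slope of sensor potential [mV] vs. log10(concentration).

    Returns the sensitivity in mV per decade of ion concentration.
    """
    x = [math.log10(c) for c in concentrations_molar]
    y = list(potentials_mv)
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

# Hypothetical calibration: ~25.8 mV shift per tenfold increase in Pb2+
conc = [1e-6, 1e-5, 1e-4, 1e-3]
pot = [100.0, 125.8, 151.6, 177.4]
slope = sensitivity_mv_per_decade(conc, pot)
```

A Nernstian response would give about 59 mV/decade for a monovalent ion at room temperature; sub-Nernstian slopes like the ones reported are common for composite sensing layers.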
Ice melting probes
(2023)
The exploration of icy environments in the solar system, such as the poles of Mars and the icy moons (a.k.a. ocean worlds), is a key aspect for understanding their astrobiological potential as well as for extraterrestrial resource inspection. On these worlds, ice melting probes are considered to be well suited for the robotic clean execution of such missions. In this chapter, we describe ice melting probes and their applications, the physics of ice melting and how the melting behavior can be modeled and simulated numerically, the challenges for ice melting, and the required key technologies to deal with those challenges. We also give an overview of existing ice melting probes and report some results and lessons learned from laboratory and field tests.
Even the shortest flight through unknown, cluttered environments requires reliable local path planning algorithms to avoid unforeseen obstacles. The algorithm must evaluate alternative flight paths and identify the best path if an obstacle blocks its way. Commonly, weighted sums are used here. This work shows that weighted Chebyshev distances and factorial achievement scalarising functions are suitable alternatives to weighted sums if combined with the 3DVFH* local path planning algorithm. Both methods considerably reduce the failure probability of simulated flights in various environments. The standard 3DVFH* uses a weighted sum and has a failure probability of 50% in the test environments. A factorial achievement scalarising function, which minimises the worst combination of two out of four objective functions, reaches a failure probability of 26%; a weighted Chebyshev distance, which optimises the worst objective, has a failure probability of 30%. These results are promising for further enhancements and broader applicability.
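The three scalarisation approaches compared above can be sketched as follows. This is a generic illustration under assumed cost and weight inputs; the function names are hypothetical and the actual 3DVFH* objective functions are not reproduced here.

```python
from itertools import combinations

def weighted_sum(costs, weights):
    """Classic scalarisation: sum of weighted objective values."""
    return sum(w * c for w, c in zip(weights, costs))

def weighted_chebyshev(costs, weights):
    """Optimises the single worst weighted objective."""
    return max(w * c for w, c in zip(weights, costs))

def factorial_achievement(costs, weights, k=2):
    """Worst weighted sum over all k-element subsets of the objectives,
    e.g. the worst combination of two out of four objective functions."""
    return max(
        sum(weights[i] * costs[i] for i in subset)
        for subset in combinations(range(len(costs)), k)
    )
```

A path candidate is then chosen by minimising one of these scores over all alternatives; swapping the scalariser leaves the rest of the planner unchanged, which is what makes the comparison in the paper possible.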
KNX is a protocol for smart building automation, e.g., for automated heating, air conditioning, or lighting. This paper analyses and evaluates state-of-the-art KNX devices from the manufacturers Merten, Gira and Siemens with respect to security. On the one hand, it is investigated whether publicly known vulnerabilities, like insecure storage of passwords in software, unencrypted communication, or denial-of-service attacks, can be reproduced in new devices. On the other hand, the security is analyzed in general, leading to the discovery of a previously unknown and high-risk vulnerability related to so-called BCU (authentication) keys.
Selected problems in the field of multivariate statistical analysis are treated, with one focus on the paired sample case. Among other things, statistical testing problems of marginal homogeneity are under consideration. In detail, properties of Hotelling's T² test in a special parametric situation are obtained. Moreover, the nonparametric problem of marginal homogeneity is discussed on the basis of possibly incomplete data. In the bivariate data case, properties of the Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic are investigated on the basis of partly not identically distributed data. Similar testing problems are treated within the scope of the application of a result for the empirical process of the concomitants for partly categorical data. Furthermore, testing changes in the modeled solvency capital requirement of an insurance company by means of a paired sample from an internal risk model is discussed. Beyond the paired sample case, a new asymptotic relative efficiency concept based on the expected volumes of multidimensional confidence regions is introduced. Besides, a new approach for the treatment of the multi-sample goodness-of-fit problem is presented. Finally, a consistent test for the goodness-of-fit problem is developed against the background of huge or infinite-dimensional data.
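As an illustration of the paired-sample setting, Hotelling's T² test for marginal homogeneity of means can be computed on the paired differences. A minimal sketch, assuming complete data and using NumPy; this is not the thesis' actual implementation, and the function name is hypothetical.

```python
import numpy as np

def hotelling_t2_paired(x, y):
    """One-sample Hotelling's T^2 on paired differences d = x - y.

    x, y: (n, p) arrays of paired multivariate observations.
    Tests equality of mean vectors: H0: E[x] = E[y].
    Returns (T^2, F) where F follows F(p, n - p) under H0 and normality.
    """
    d = np.asarray(x, float) - np.asarray(y, float)
    n, p = d.shape
    dbar = d.mean(axis=0)
    S = np.cov(d, rowvar=False)  # unbiased sample covariance (p x p)
    t2 = n * dbar @ np.linalg.solve(S, dbar)
    f_stat = (n - p) / (p * (n - 1)) * t2
    return t2, f_stat
```

A large F value relative to the F(p, n - p) distribution leads to rejection of marginal homogeneity of the means.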
The present work studied the mainstream feasibility of the deammonifying sludge from the side stream of the municipal wastewater treatment plant (MWWTP) in Kaster, Germany. For this purpose, the deammonifying sludge available at the side stream was investigated for nitrogen (N) removal with respect to the operational factors temperature (15–30°C), pH value (6.0–8.0) and chemical oxygen demand (COD)/N ratio (≤1.5–6.0). The highest and lowest N-removal rates of 0.13 and 0.045 kg/(m³ d) were achieved at 30 and 15°C, respectively. Different pH conditions and COD/N ratios in the sequencing batch reactors (SBRs) for partial nitritation/anammox (PN/A) significantly influenced both the metabolic processes and the associated N-removal rates. The scientific insights gained from the current work signify the possibility of mainstream PN/A at WWTPs. The current study forms a solid basis for the operational window of the upcoming semi-technical trials to be conducted prior to full-scale mainstream PN/A at WWTP Kaster and at WWTPs globally.
Connective tissues such as tendons contain an extracellular matrix (ECM) comprising collagen fibrils scattered within the ground substance. These fibrils are instrumental in lending mechanical stability to tissues. Unfortunately, our understanding of how collagen fibrils reinforce the ECM remains limited, with no direct experimental evidence substantiating current theories. Earlier theoretical studies on collagen fibril reinforcement in the ECM have relied predominantly on the assumption of uniform cylindrical fibers, which is inadequate for collagen fibrils, which possess tapered ends. Recently, Topçu and colleagues published a paper in the International Journal of Solids and Structures, presenting a generalized shear-lag theory for the transfer of elastic stress between the matrix and fibers with tapered ends. This paper is a positive step towards comprehending the mechanics of the ECM and makes a valuable contribution to formulating a complete theory of collagen fibril reinforcement in the ECM.
To fulfil the CO2 emission reduction targets of the European Union (EU), heavy-duty (HD) trucks need to operate 15% more efficiently by 2025 and 30% by 2030. Their electrification is necessary, as conventional HD trucks are already optimized for the long-haul application. The resulting hybrid electric vehicle (HEV) truck gains most of its fuel saving potential from the recuperation of potential energy and its subsequent utilization. The key to utilizing the full potential of HEV-HD trucks is to maximize the amount of recuperated energy and ensure its intelligent usage while keeping the operating point of the internal combustion engine as efficient as possible. To achieve this goal, an intelligent energy management strategy (EMS) based on the equivalent consumption minimization strategy (ECMS) is developed for a parallel HEV-HD truck, which uses predictive discharge of the battery and an operating strategy that adapts to the height profile and the vehicle mass. The presented EMS can reproduce the globally optimal operating strategy over long phases and leads to a fuel saving potential of up to 2% compared with a heuristic strategy. Furthermore, the fuel saving potential is correlated with the investigated boundary conditions to deepen the understanding of the impact of intelligent EMS for HEV-HD trucks.
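The core of an ECMS is to minimise, at each instant, an equivalent fuel consumption in which battery power is weighted by an equivalence factor s. Below is a minimal sketch of that torque-split decision with a hypothetical engine fuel map; the paper's predictive discharge and adaptive extensions are not modelled here.

```python
def ecms_equivalent_consumption(fuel_power_w, battery_power_w, s):
    """Instantaneous equivalent fuel power [W]: engine fuel power plus
    battery power weighted by the equivalence factor s (dimensionless)."""
    return fuel_power_w + s * battery_power_w

def choose_torque_split(demand_w, candidate_motor_powers_w, engine_fuel_power, s):
    """Pick the electric-motor power [W] that minimises equivalent consumption.

    engine_fuel_power: callable mapping mechanical engine power [W] to fuel
    power [W] — a hypothetical engine map, not from the paper.
    Positive motor power discharges the battery; negative power recuperates.
    """
    return min(
        candidate_motor_powers_w,
        key=lambda p_em: ecms_equivalent_consumption(
            engine_fuel_power(demand_w - p_em), p_em, s),
    )
```

Adapting s along the route (e.g., before a downhill section where recuperation is expected) is what turns this instantaneous rule into the predictive strategy described above.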
Messenger apps like WhatsApp and Telegram are frequently used for everyday communication, but they can also be utilized as a platform for illegal activity. Telegram allows public groups with up to 200,000 participants. Criminals use these public groups for trading illegal commodities and services, which is a concern for law enforcement agencies, who manually monitor suspicious activity in these chat rooms. This research demonstrates how natural language processing (NLP) can assist in analyzing these chat rooms, providing an explorative overview of the domain and facilitating purposeful analyses of user behavior. We provide a publicly available corpus of text messages annotated with entities and relations from four self-proclaimed black market chat rooms. Our pipeline approach aggregates the extracted product attributes from user messages into profiles and uses these, together with the sold products, as features for clustering. The extracted structured information is the foundation for further data exploration, such as identifying the top vendors or fine-granular price analyses. Our evaluation shows that pretrained word vectors perform better for unsupervised clustering than state-of-the-art transformer models, while the latter are still superior for sequence labeling.
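The profile-aggregation step described above can be sketched as follows: extracted product attributes are grouped per user, enabling top-vendor rankings and price analyses. The message schema here is a hypothetical simplification; the paper's corpus uses richer entity and relation annotations.

```python
from collections import defaultdict
from statistics import mean

def build_profiles(messages):
    """Aggregate extracted (user, product, price) triples into vendor profiles.

    messages: iterable of dicts with keys 'user', 'product', 'price'
    (an assumed schema for illustration).
    """
    raw = defaultdict(lambda: defaultdict(list))
    for m in messages:
        raw[m["user"]][m["product"]].append(m["price"])
    return {
        user: {prod: {"n": len(ps), "mean_price": mean(ps)}
               for prod, ps in prods.items()}
        for user, prods in raw.items()
    }

def top_vendors(profiles, k=3):
    """Rank users by total number of extracted product mentions."""
    totals = {u: sum(v["n"] for v in prods.values())
              for u, prods in profiles.items()}
    return sorted(totals, key=totals.get, reverse=True)[:k]
```

Once profiles exist, the per-product counts and price statistics can serve directly as clustering features, as in the pipeline described above.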
Supervised machine learning and deep learning require a large amount of labeled data, which data scientists obtain in a manual, time-consuming annotation process. To mitigate this challenge, Active Learning (AL) proposes promising data points for annotators to annotate next, instead of a sequential or random sample. This method is intended to save annotation effort while maintaining model performance.
However, practitioners face many AL strategies for different tasks and need an empirical basis to choose between them. Surveys categorize AL strategies into taxonomies without performance indications. Publications presenting novel AL strategies compare their performance only to a small subset of existing strategies. Our contribution addresses this empirical basis by introducing a reproducible active learning evaluation (ALE) framework for the comparative evaluation of AL strategies in NLP.
The framework allows the implementation of AL strategies with low effort and a fair data-driven comparison by defining and tracking experiment parameters (e.g., initial dataset size, number of data points per query step, and the budget). ALE helps practitioners make more informed decisions, and researchers can focus on developing new, effective AL strategies and deriving best practices for specific use cases. With these best practices, practitioners can lower their annotation costs. We present a case study to illustrate how to use the framework.
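A typical pool-based AL loop of the kind such a framework evaluates can be sketched as follows, with the tracked experiment parameters (initial dataset size, points per query step, budget) as explicit arguments. The function names and the least-confidence strategy are illustrative; this is not the ALE API.

```python
import random

def least_confidence(probs):
    """Uncertainty score: 1 minus the confidence of the most likely class."""
    return 1.0 - max(probs)

def active_learning_loop(unlabeled_pool, oracle, fit, predict_proba,
                         initial_size=10, query_size=5, budget=30, seed=0):
    """Generic pool-based AL loop with least-confidence sampling.

    fit(labeled) -> model; predict_proba(model, x) -> class probabilities;
    oracle(x) -> label (the human annotator). Stops when the labeling
    budget is exhausted or the pool is empty.
    """
    rng = random.Random(seed)
    pool = list(unlabeled_pool)
    rng.shuffle(pool)
    labeled = [(x, oracle(x)) for x in pool[:initial_size]]
    pool = pool[initial_size:]
    spent = initial_size
    model = fit(labeled)
    while spent < budget and pool:
        # Query the most uncertain points first
        pool.sort(key=lambda x: least_confidence(predict_proba(model, x)),
                  reverse=True)
        batch, pool = pool[:query_size], pool[query_size:]
        labeled.extend((x, oracle(x)) for x in batch)
        spent += len(batch)
        model = fit(labeled)
    return model, labeled
```

Swapping the sort key for another scoring function is all it takes to plug in a different AL strategy, which is exactly the kind of controlled variation an evaluation framework needs to support.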