Refine
Year of publication
Institute
- Fachbereich Medizintechnik und Technomathematik (1693)
- Fachbereich Elektrotechnik und Informationstechnik (719)
- IfB - Institut für Bioengineering (624)
- Fachbereich Energietechnik (589)
- INB - Institut für Nano- und Biotechnologien (557)
- Fachbereich Chemie und Biotechnologie (552)
- Fachbereich Luft- und Raumfahrttechnik (497)
- Fachbereich Maschinenbau und Mechatronik (283)
- Fachbereich Wirtschaftswissenschaften (222)
- Solar-Institut Jülich (165)
Language
- English (4935)
Document Type
- Article (3285)
- Conference Proceeding (1170)
- Part of a Book (195)
- Book (146)
- Doctoral Thesis (32)
- Conference: Meeting Abstract (29)
- Patent (25)
- Other (10)
- Report (10)
- Conference Poster (6)
Keywords
- Biosensor (25)
- Finite-Elemente-Methode (12)
- Einspielen <Werkstoff> (10)
- CAD (8)
- civil engineering (8)
- Bauingenieurwesen (7)
- Blitzschutz (6)
- FEM (6)
- Gamification (6)
- Limit analysis (6)
At (ultra)high magnetic fields the artifact sensitivity of ECG recordings increases. This bears the risk of R-wave mis-registration, which has been consistently reported for ECG-triggered CMR at 7.0T. Recognizing the constraints of conventional ECG, acoustic cardiac triggering (ACT) has been proposed, but ACT has not yet been carefully examined for clinical use. For this reason, this work scrutinizes the suitability, accuracy and reproducibility of ACT for CMR at 7.0T. To this end, the trigger reliability and trigger detection variance are examined together with a qualitative and quantitative assessment of image quality of the heart at 7.0T.
Assessment of RF Safety of Transmit Coils at 7 Tesla by Experimental and Numerical Procedures
(2012)
Lifting propellers are of increasing interest for Advanced Air Mobility. All propellers and rotors are initially twisted beams, showing significant extension–twist coupling and centrifugal twisting. Torsional deformations severely impact aerodynamic performance. This paper presents a novel approach to assess different reasons for torsional deformations. A reduced-order model runs large parameter sweeps with algebraic formulations and numerical solution procedures. Generic beams represent three different propeller types for General Aviation, Commercial Aviation, and Advanced Air Mobility. Simulations include solid and hollow cross-sections made of aluminum, steel, and carbon fiber-reinforced polymer. The investigation shows that centrifugal twisting moments depend on both the elastic and initial twist. The determination of the centrifugal twisting moment solely based on the initial twist suffers from errors exceeding 5% in some cases. The nonlinear parts of the torsional rigidity do not significantly impact the overall torsional rigidity for the investigated propeller types. The extension–twist coupling related to the initial and elastic twist in combination with tension forces significantly impacts the net cross-sectional torsional loads. While the increase in torsional stiffness due to initial twist contributes to the overall stiffness for General and Commercial Aviation propellers, its contribution to the lift propeller’s stiffness is limited. The paper closes with the presentation of approximations for each effect identified as significant. Numerical evaluations are necessary to determine each effect for inhomogeneous cross-sections made of anisotropic material.
Large industrial facilities and power plants often require a huge number of information and control cables between the different structures. These I&C cables can be routed in reinforced concrete cable ducts or in isolated buried cable runs. KTA 2206 is the German lightning protection standard for nuclear power plants, and over the last several years considerable effort has been made to revise it. In addition to the well-established principles and design guidelines for the construction of the lightning protection system, this standard puts special emphasis on the coupling of transient overvoltages into I&C cables.
The Monte Carlo code FLUKA is used to simulate the production of a number of positron emitting radionuclides, ¹⁸F, ¹³N, ⁹⁴Tc, ⁴⁴Sc, ⁶⁸Ga, ⁸⁶Y, ⁸⁹Zr, ⁵²Mn, ⁶¹Cu and ⁵⁵Co, on a small medical cyclotron with a proton beam energy of 13 MeV. Experimental data collected at the TR13 cyclotron at TRIUMF agree within a factor of 0.6 ± 0.4 with the directly simulated data, except for the production of ⁵⁵Co, where the simulation underestimates the experiment by a factor of 3.4 ± 0.4. The experimental data also agree within a factor of 0.8 ± 0.6 with the convolution of simulated proton fluence and cross sections from literature. Overall, this confirms the applicability of FLUKA to simulate radionuclide production at 13 MeV proton beam energy.
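The convolution of simulated proton fluence and cross sections mentioned above amounts to an energy integral of fluence times cross section. The following is a minimal numerical sketch of that integral; the energy grid, fluence spectrum, and cross-section curve below are illustrative placeholders, not TR13 or FLUKA data.

```python
import numpy as np

MB_TO_CM2 = 1e-27  # 1 millibarn in cm^2

def reaction_rate_per_nucleus(energy_mev, fluence_rate, cross_section_mb):
    """Reaction rate per target nucleus [1/s]: trapezoidal integral over
    energy of the proton fluence rate [p / (cm^2 MeV s)] times the
    reaction cross section [cm^2]."""
    e = np.asarray(energy_mev, dtype=float)
    integrand = (np.asarray(fluence_rate, dtype=float)
                 * np.asarray(cross_section_mb, dtype=float) * MB_TO_CM2)
    # trapezoidal rule over the energy grid
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(e)))

# Illustrative placeholder spectra (not measured or simulated data):
E = np.linspace(5.0, 13.0, 81)                    # MeV
phi = 1e10 * np.exp(-((E - 9.0) / 2.0) ** 2)      # p / (cm^2 MeV s)
sigma = 200.0 * np.exp(-((E - 10.0) / 3.0) ** 2)  # mb
rate = reaction_rate_per_nucleus(E, phi, sigma)
```

Multiplying this per-nucleus rate by the number of target nuclei seen by the beam gives the production rate of the radionuclide, which is the quantity compared between simulation and experiment.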
In this study, an online multi-sensing platform was engineered to simultaneously evaluate various process parameters of food package sterilization using gaseous hydrogen peroxide (H₂O₂). The platform enabled the validation of critical aseptic parameters. In parallel, a series of microbiological count reduction tests was performed using highly resistant spores of B. atrophaeus DSM 675, serving as the reference method for sterility validation. By means of the multi-sensing platform together with microbiological tests, we examined sterilization process parameters to define the most effective conditions with regard to the highest spore kill rate necessary for aseptic packaging. As these parameters are mutually associated, a correlation between the different factors was elaborated. The resulting correlation indicated the need for specific conditions regarding the applied H₂O₂ gas temperature, the gas flow and concentration, the relative humidity and the exposure time. Finally, the novel multi-sensing platform together with the mobile electronic readout setup allowed for the online and on-site monitoring of the sterilization process, selecting the best conditions for sterility and, at the same time, reducing the use of the time-consuming and costly microbiological tests that are currently used in the food package industry.
This study presents the concept of AstroBioLab, an autonomous astrobiological field laboratory tailored for the exploration of (sub)glacial habitats. AstroBioLab is an integral component of the TRIPLE (Technologies for Rapid Ice Penetration and subglacial Lake Exploration) DLR-funded project, aimed at advancing astrobiology research through the development and deployment of innovative technologies. AstroBioLab integrates diverse measurement techniques such as fluorescence microscopy, DNA sequencing and fluorescence spectrometry, while leveraging microfluidics for efficient sample delivery and preparation.
We study the estimation of some linear functionals which are based on an unknown lifetime distribution. The observations are assumed to be generated under the semi-parametric random censorship model (SRCM), that is, a random censorship model where the conditional expectation of the censoring indicator given the observation belongs to a parametric family. Under this setup a semi-parametric estimator of the survival function was introduced by the author. If the parametric model assumption is correct, it is known that the estimated functional which is based on this semi-parametric estimator is asymptotically at least as efficient as the corresponding one which rests on the nonparametric Kaplan–Meier estimator.
In this paper we show that the estimated functional which is based on this semi-parametric estimator is asymptotically efficient with respect to the class of all regular estimators under this semi-parametric model.
Atmospheric pressure plasma-jet treatment of PAN-nonwovens—carbonization of nanofiber electrodes
(2022)
Carbon nanofibers are produced from dielectric polymer precursors such as polyacrylonitrile (PAN). Carbonized nanofiber nonwovens show high surface area and good electrical conductivity, rendering these fiber materials interesting for application as electrodes in batteries, fuel cells, and supercapacitors. However, thermal processing is slow and costly, which is why new processing techniques have been explored for carbon fiber tows. Alternatives for the conversion of PAN-precursors into carbon fiber nonwovens are scarce. Here, we utilize an atmospheric pressure plasma jet to conduct carbonization of stabilized PAN nanofiber nonwovens. We explore the influence of various processing parameters on the conductivity and degree of carbonization of the converted nanofiber material. The precursor fibers are converted by plasma-jet treatment to carbon fiber nonwovens within seconds, by which they develop a rough surface making subsequent surface activation processes obsolete. The resulting carbon nanofiber nonwovens are applied as supercapacitor electrodes and examined by cyclic voltammetry and impedance spectroscopy. Nonwovens that are carbonized within 60 s show capacitances of up to 5 F g⁻¹.
Carbon nanofiber nonwovens represent a powerful class of materials with prospective application in filtration technology or as electrodes with high surface area in batteries, fuel cells, and supercapacitors. While new precursor-to-carbon conversion processes have been explored to overcome productivity restrictions for carbon fiber tows, alternatives for the two-step thermal conversion of polyacrylonitrile precursors into carbon fiber nonwovens are absent. In this work, we develop a continuous roll-to-roll stabilization process using an atmospheric pressure microwave plasma jet. We explore the influence of various plasma-jet parameters on the morphology of the nonwoven and compare the stabilized nonwoven to thermally stabilized samples using scanning electron microscopy, differential scanning calorimetry, and infrared spectroscopy. We show that stabilization with a non-equilibrium plasma-jet can be twice as productive as the conventional thermal stabilization in a convection furnace, while producing electrodes of comparable electrochemical performance.
Attitude and Orbital Dynamics Modeling for an Uncontrolled Solar-Sail Experiment in Low-Earth Orbit
(2015)
Gossamer-1 is the first project of the three-step Gossamer roadmap, the purpose of which is to develop, prove and demonstrate that solar-sail technology is a safe and reliable propulsion technique for long-lasting and high-energy missions. This paper first presents the structural analysis performed on the sail to understand its elastic behavior. The results are then used in attitude and orbital simulations. The model considers the main forces and torques that a satellite experiences in low-Earth orbit, coupled with the sail deformation. By running the simulations for varying initial conditions in attitude and rotation rate, the results show initial states to avoid and the maximum rotation rates reached for correct and faulty deployment of the sail. Lastly, comparisons with the classic flat-sail model are carried out to test the hypothesis that the elastic behavior plays a role in the attitude and orbital behavior of the sail.
Close interrelations between sound and image are not a mere phenomenon of today’s multimedia technology. The idea of the synthesis of different media lies at the core of the concept of the Gesamtkunstwerk in the second half of the 19th century and it can also be traced back to the synaesthesia debate at the beginning of the 20th century [...].
Sleep scoring is a necessary and time-consuming task in sleep studies. In animal models (such as mice) or in humans, automating this tedious process promises to facilitate long-term studies and to promote sleep biology as a data-driven field. We introduce a deep neural network model that is able to predict different states of consciousness (Wake, Non-REM, REM) in mice from EEG and EMG recordings with excellent scoring results for out-of-sample data. Predictions are made on epochs of 4 seconds in length, and epochs are classified as artifact-free or not. The model architecture draws on recent advances in deep learning and in convolutional neural network research. In contrast to previous approaches to automated sleep scoring, our model does not rely on manually defined features of the data but learns predictive features automatically. We expect deep learning models like ours to become widely applied in different fields, automating many repetitive cognitive tasks that were previously difficult to tackle.
Having well-defined control strategies for fuel cells that can efficiently detect errors and take corrective action is critically important for safety in all applications, and especially so in aviation. The algorithms not only ensure operator safety by monitoring the fuel cell and connected components, but also contribute to preserving the health of the fuel cell and to its durability and safe operation over its lifetime. While sensors provide peripheral data surrounding the fuel cell, the internal states of the fuel cell cannot be directly measured. To overcome this restriction, a Kalman filter has been implemented as an internal state observer.
Other safety conditions are evaluated using real-time data from every connected sensor, and corrective actions are taken automatically to ensure safety. The algorithms discussed in this paper have been validated through Model-in-the-Loop (MiL) tests as well as practical tests at a dedicated test bench.
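As a minimal sketch of the state-observer idea referenced above (not the authors' implementation, and with generic matrices rather than a fuel-cell model), one predict/update cycle of a linear Kalman filter looks like:

```python
import numpy as np

def kalman_step(x, P, z, A, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.
    x, P : current state estimate and its covariance
    z    : new measurement vector
    A, H : state-transition and observation matrices
    Q, R : process- and measurement-noise covariances
    Returns the updated state estimate and covariance."""
    # Predict: propagate state and covariance through the model
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update: blend prediction and measurement via the Kalman gain
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

Fed with the peripheral sensor readings, such a recursion yields estimates of states that cannot be measured directly; the actual observer in the paper would use a fuel-cell-specific system model for A, H, Q, and R.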
Combined with the use of renewable energy sources for its production, hydrogen represents a possible alternative gas turbine fuel for future low-emission power generation. Due to its physical properties, which differ from those of other fuels such as natural gas, well-established gas turbine combustion systems cannot be directly applied to Dry Low NOx (DLN) hydrogen combustion. This makes the development of new combustion technologies an essential and challenging task for the future of hydrogen-fueled gas turbines.
The newly developed and successfully tested "DLN Micromix" combustion technology offers great potential to burn hydrogen in gas turbines at very low NOx emissions. Aiming to further develop an existing burner design in terms of increased energy density, a redesign is required in order to stabilise the flames at higher mass flows and to maintain low emission levels.
For this purpose, a systematic design exploration has been carried out with the support of CFD and optimisation tools to identify the interactions of geometrical and design parameters on the combustor performance. Aerodynamic effects as well as flame and emission formation are observed and understood time- and cost-efficiently. As a result, correlations between single geometric values, the pressure drop of the burner and NOx production have been identified. This numerical methodology helps to reduce the manufacturing and testing effort to a few designs for single validation campaigns, in order to confirm the flame stability and NOx emissions across a wider range of operating conditions.
An approach to automatically generate a dynamic energy simulation model in Modelica for a single existing building is presented. It aims to collect data about the status quo in preparation for energy retrofits with low effort and cost. The proposed method starts from a polygon model of the outer building envelope obtained from photogrammetrically generated point clouds. The open-source tools TEASER and AixLib are used for data enrichment and model generation. A case study was conducted on a single-family house. The resulting model can accurately reproduce the internal air temperatures during synthetic heating-up and cooling-down phases. Modelled and measured whole-building heat transfer coefficients (HTC) agree within a 12% range. A sensitivity analysis emphasises the importance of accurate window characterisations and justifies the use of a very simplified interior geometry. Uncertainties arising from the use of archetype U-values are estimated by comparing different typologies, with best- and worst-case estimates showing differences in pre-retrofit heat demand of about ±20% from the average; however, as the assumptions made are permitted by some national standards, the method is already close to practical applicability and opens up a path to quickly estimate possible financial and energy savings after refurbishment.
Wind-induced operational variability is one of the major challenges for structural health monitoring of slender engineering structures like aircraft wings or wind turbine blades. Damage sensitive features often show an even bigger sensitivity to operational variability. In this study a composite cantilever was subjected to multiple mass configurations, velocities and angles of attack in a controlled wind tunnel environment. A small-scale impact damage was introduced to the specimen and the structural response measurements were repeated. The proposed damage detection methodology is based on automated operational modal analysis. A novel baseline preparation procedure is described that reduces the amount of user interaction to the provision of a single consistency threshold. The procedure starts with an indeterminate number of operational modal analysis identifications from a large number of datasets and returns a complete baseline matrix of natural frequencies and damping ratios that is suitable for subsequent anomaly detection. Mahalanobis distance-based anomaly detection is then applied to successfully detect the damage under varying severities of operational variability and with various degrees of knowledge about the present operational conditions. The damage detection capabilities of the proposed methodology were found to be excellent under varying velocities and angles of attack. Damage detection was less successful under joint mass and wind variability but could be significantly improved through the provision of the currently encountered operational conditions.
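The Mahalanobis-distance anomaly detection step described above can be sketched as follows. This is a generic illustration, not the study's implementation: the baseline feature matrix here would hold rows of identified natural frequencies and damping ratios, but the data below are synthetic.

```python
import numpy as np

def mahalanobis_detector(baseline, threshold_quantile=0.99):
    """Fit mean and covariance on baseline feature vectors and return
    a scoring function (squared Mahalanobis distance to the baseline)
    plus a detection threshold taken as a quantile of the baseline
    scores; scores above the threshold flag a potential anomaly."""
    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

    def score(x):
        d = np.asarray(x, dtype=float) - mu
        return float(d @ cov_inv @ d)

    train_scores = np.array([score(row) for row in baseline])
    threshold = float(np.quantile(train_scores, threshold_quantile))
    return score, threshold

# Synthetic baseline: 500 "healthy" 3-dimensional feature vectors
rng = np.random.default_rng(0)
baseline = rng.normal(size=(500, 3))
score, threshold = mahalanobis_detector(baseline)
```

Under operational variability, the baseline can be restricted to observations from similar operating conditions, which is essentially what providing the currently encountered conditions achieves in the study.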
Reliable automation of the labor-intensive manual task of scoring animal sleep can facilitate the analysis of long-term sleep studies. In recent years, deep-learning-based systems, which learn optimal features from the data, have increased scoring accuracies for the classical sleep stages of Wake, REM, and Non-REM. Meanwhile, it has been recognized that the statistics of transitional stages such as pre-REM, found between Non-REM and REM, may hold additional insight into the physiology of sleep and are now under active investigation. We propose a classification system based on a simple neural network architecture that scores the classical stages as well as pre-REM sleep in mice. When restricted to the classical stages, the optimized network showed state-of-the-art classification performance with an out-of-sample F1 score of 0.95 in male C57BL/6J mice. When unrestricted, the network showed lower F1 scores on pre-REM (0.5) compared to the classical stages. The result is comparable to previous attempts to score transitional stages in other species, such as transition sleep in rats or N1 sleep in humans. Nevertheless, we observed that the sequence of predictions including pre-REM typically transitioned from Non-REM to REM, reflecting sleep dynamics observed by human scorers. Our findings provide further evidence for the difficulty of scoring transitional sleep stages, likely because such stages of sleep are under-represented in typical data sets or show large inter-scorer variability. We further provide our source code and an online platform to run predictions with our trained network.
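The F1 scores quoted above are the harmonic mean of precision and recall per class. A minimal self-contained illustration (the counts below are made up, not the study's confusion matrix):

```python
def f1_score(tp, fp, fn):
    """F1 from true positives, false positives, and false negatives:
    the harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for one class: 95 correct detections,
# 5 false alarms, 5 misses -> precision = recall = F1 = 0.95
f1 = f1_score(95, 5, 5)
```

A class that is rare in the training data, like pre-REM, tends to accumulate proportionally more false positives and false negatives, which drags both factors of the harmonic mean down and explains why its F1 lags behind the classical stages.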
In collaborative research projects, both researchers and practitioners work together solving business-critical challenges. These projects often deal with ETL processes, in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are not deterministic, the quality of their output may degrade over time. This leads to an overall quality loss in the application that embeds the machine learning models. Hence, software quality in development and in production may differ.
Machine learning models are black boxes. That makes practitioners skeptical and raises the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production enables an early response to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that possible new inputs are integrated into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with automated software quality monitoring in productive use and the possibility to auto-generate new test data from documents processed in production.
Through automated monitoring of the software quality and auto-generated test data, this approach ensures that the software quality meets and keeps the requested thresholds in productive use, even during further continuous deployment and with changing input data.
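The threshold-based quality gate described above can be sketched in a few lines; the metric names and threshold values here are hypothetical, not taken from the reference implementation.

```python
def check_quality(metrics, thresholds):
    """Compare production metrics (e.g. accuracy measured on
    auto-generated test data) against required thresholds and
    return the names of metrics that fell below their threshold."""
    return [name for name, required in thresholds.items()
            if metrics.get(name, 0.0) < required]

# Hypothetical monitoring snapshot: accuracy passes, F1 does not.
alerts = check_quality({"accuracy": 0.91, "f1": 0.84},
                       {"accuracy": 0.90, "f1": 0.88})
```

In a continuous-deployment setting, a non-empty alert list would trigger the early response the paper argues for, such as retraining on the newly collected production data.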
A concept for the analysis and optimal design of reinforced concrete structures is described. It is based on a nonlinear optimization algorithm and a finite element program for linear and nonlinear analysis of structures. With the aim of minimal-cost design, a two-stage optimization using an efficient gradient algorithm is developed. The optimization problems on the global (structural) and local (cross-sectional) levels are formulated. A parallelization concept for solving the two-stage optimization problem in minimal time is presented. Examples are included to illustrate the practical use and the effectiveness of the parallelization in the area of engineering design.
Reliable methods for automatic readability assessment have the potential to impact a variety of fields, ranging from machine translation to self-informed learning. Recently, large language models for the German language (such as GBERT and GPT-2-Wechsel) have become available, allowing the development of Deep Learning based approaches that promise to further improve automatic readability assessment. In this contribution, we studied the ability of ensembles of fine-tuned GBERT and GPT-2-Wechsel models to reliably predict the readability of German sentences. We combined these models with linguistic features and investigated the dependence of prediction performance on ensemble size and composition. Mixed ensembles of GBERT and GPT-2-Wechsel performed better than ensembles of the same size consisting of only GBERT or GPT-2-Wechsel models. Our models were evaluated in the GermEval 2022 Shared Task on Text Complexity Assessment on data of German sentences. On out-of-sample data, our best ensemble achieved a root mean squared error of 0.435.
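Ensemble averaging and RMSE evaluation, as used in the evaluation above, can be sketched as follows; the predictions and targets below are synthetic, not GermEval data or model outputs.

```python
import numpy as np

def ensemble_rmse(member_predictions, targets):
    """Average the per-model regression predictions (rows = models,
    columns = sentences) and compute the root mean squared error of
    the ensemble prediction against the target complexity scores."""
    ensemble = np.mean(member_predictions, axis=0)
    return float(np.sqrt(np.mean((ensemble - targets) ** 2)))

# Synthetic example: three models scoring four sentences.
preds = np.array([[1.0, 2.0, 3.0, 4.0],
                  [1.2, 1.8, 3.1, 4.2],
                  [0.8, 2.2, 2.9, 3.8]])
targets = np.array([1.0, 2.0, 3.0, 4.0])
rmse = ensemble_rmse(preds, targets)
```

Averaging tends to cancel uncorrelated errors of the individual models, which is one reason mixed GBERT/GPT-2-Wechsel ensembles can outperform homogeneous ones of the same size.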
The eVTOL industry is a rapidly growing mass market expected to start in 2024. Owing to their predicted missions, eVTOLs compete with ground-based transportation modes, mainly passenger cars. Therefore, the automotive and classical aircraft design processes are reviewed and compared to highlight advantages for eVTOL development. A special focus is on ergonomic comfort and safety. The need for further investigation of eVTOL crashworthiness is outlined by, first, specifying the relevance of passive safety via accident statistics and customer perception analysis; second, comparing the current state of regulation and certification; and third, discussing the advantages of integral safety and of applying the automotive safety approach to eVTOL development. Integral safety links active and passive safety, while the automotive safety approach means implementing standardized mandatory full-vehicle crash tests for future eVTOLs. Subsequently, possible crash impact conditions are analyzed, and three full-vehicle crash load cases are presented.
We present an automated pipeline for the generation of synthetic datasets for six-dimensional (6D) object pose estimation. To this end, a completely automated generation process based on predefined settings is developed, which enables the user to create large datasets with a minimum of interaction and which is feasible for applications with high object variance. The pipeline is based on the Unreal 4 (UE4) game engine and provides high variation for domain randomization, such as object appearance, ambient lighting, camera-object transformation and distractor density. In addition to the object pose and bounding box, the metadata include all randomization parameters, which enables further studies on randomization parameter tuning. The developed workflow is adaptable to other 3D objects and UE4 environments. An exemplary dataset is provided, including five objects of the Yale-CMU-Berkeley (YCB) object set. The datasets consist of 6 million subsegments using 97 rendering locations in 12 different UE4 environments. Each dataset subsegment includes one RGB image, one depth image and one class segmentation image at pixel level.