This study presents the concept of AstroBioLab, an autonomous astrobiological field laboratory tailored for the exploration of (sub)glacial habitats. AstroBioLab is an integral component of the DLR-funded TRIPLE (Technologies for Rapid Ice Penetration and subglacial Lake Exploration) project, which aims to advance astrobiology research through the development and deployment of innovative technologies. AstroBioLab integrates diverse measurement techniques such as fluorescence microscopy, DNA sequencing, and fluorescence spectrometry, while leveraging microfluidics for efficient sample delivery and preparation.
Attitude and Orbital Dynamics Modeling for an Uncontrolled Solar-Sail Experiment in Low-Earth Orbit
(2015)
Gossamer-1 is the first project of the three-step Gossamer roadmap, the purpose of which is to develop, prove and demonstrate that solar-sail technology is a safe and reliable propulsion technique for long-lasting and high-energy missions. This paper first presents the structural analysis performed on the sail to understand its elastic behavior. The results are then used in attitude and orbital simulations. The model considers the main forces and torques that a satellite experiences in low-Earth orbit, coupled with the sail deformation. Running the simulations for varying initial conditions in attitude and rotation rate, the results show which initial states to avoid and the maximum rotation rates reached for correct and faulty deployment of the sail. Lastly, comparisons with the classic flat-sail model are carried out to test the hypothesis that the elastic behavior plays a role in the attitude and orbital behavior of the sail.
Close interrelations between sound and image are not a mere phenomenon of today’s multimedia technology. The idea of the synthesis of different media lies at the core of the concept of the Gesamtkunstwerk in the second half of the 19th century and it can also be traced back to the synaesthesia debate at the beginning of the 20th century [...].
Having well-defined control strategies for fuel cells that can efficiently detect errors and take corrective action is critically important for safety in all applications, and especially so in aviation. The algorithms not only ensure operator safety by monitoring the fuel cell and connected components, but also contribute to extending the health of the fuel cell, its durability, and its safe operation over its lifetime. While sensors provide peripheral data surrounding the fuel cell, the internal states of the fuel cell cannot be directly measured. To overcome this restriction, a Kalman filter has been implemented as an internal state observer.
Other safety conditions are evaluated using real-time data from every connected sensor, and corrective actions automatically take place to ensure safety. The algorithms discussed in this paper have been validated through Model-in-the-Loop (MiL) tests as well as practical validation at a dedicated test bench.
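As a hedged illustration of the observer idea: the abstract does not specify the fuel cell's state-space model, so the matrices A, B, H, Q, R below are generic placeholders, not the paper's model. A minimal discrete-time Kalman filter step for estimating unmeasurable internal states from sensor readings might look like this:

```python
import numpy as np

def kalman_step(x, P, u, y, A, B, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : current state estimate and its covariance
    u, y : control input and sensor measurement at this step
    A, B : state transition and input matrices (x_{k+1} = A x_k + B u_k + w_k)
    H    : measurement matrix (y_k = H x_k + v_k)
    Q, R : process and measurement noise covariances
    """
    # Predict: propagate the estimate through the model
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    # Update: blend the prediction with the measurement
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

Repeated calls with live sensor data yield a running estimate of the internal state together with an uncertainty measure (P), which is what makes threshold-based safety checks on unmeasured states possible.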
Combined with the use of renewable energy sources for its production, hydrogen represents a possible alternative gas turbine fuel for future low-emission power generation. Due to its physical properties, which differ from those of other fuels such as natural gas, well-established gas turbine combustion systems cannot be directly applied to Dry Low NOx (DLN) hydrogen combustion. This makes the development of new combustion technologies an essential and challenging task for the future of hydrogen-fueled gas turbines.
The newly developed and successfully tested "DLN Micromix" combustion technology offers great potential to burn hydrogen in gas turbines at very low NOx emissions. Aiming to further develop an existing burner design in terms of increased energy density, a redesign is required in order to stabilise the flames at higher mass flows and to maintain low emission levels.
For this purpose, a systematic design exploration has been carried out with the support of CFD and optimisation tools to identify the interactions of geometrical and design parameters on combustor performance. Aerodynamic effects as well as flame and emission formation are observed and understood time- and cost-efficiently. As a result, correlations between individual geometric values, the pressure drop of the burner, and NOx production have been identified. This numerical methodology helps to reduce the manufacturing and testing effort to a few designs for single validation campaigns, in order to confirm flame stability and NOx emissions over a wider range of operating conditions.
In collaborative research projects, researchers and practitioners work together to solve business-critical challenges. These projects often deal with ETL processes in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are not deterministic, their output quality may decrease over time. This leads to an overall quality loss in the application that embeds the machine learning models. Hence, software quality in development and in production may differ.
Machine learning models are black boxes. This makes practitioners skeptical and raises the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production enables an early response to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that possible new inputs are integrated into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with automated software quality monitoring in productive use and the possibility to auto-generate new test data from documents processed in production.
Through automated monitoring of software quality and auto-generated test data, this approach ensures that the software quality meets and maintains the requested thresholds in productive use, even during further continuous deployment and with changing input data.
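The threshold-monitoring idea can be sketched as follows. This is an illustrative stand-in, not the paper's reference implementation: the class name, the rolling-window design, and the retraining trigger are our assumptions about how such a monitor could be wired up.

```python
from collections import deque

class QualityMonitor:
    """Track a rolling quality metric in production and flag quality loss.

    threshold : minimum acceptable fraction of correct predictions
    window    : number of most recent predictions considered
    (Both are illustrative parameters, not values from the paper.)
    """

    def __init__(self, threshold: float, window: int = 100):
        self.threshold = threshold
        self.scores = deque(maxlen=window)  # old entries drop off automatically

    def record(self, correct: bool) -> None:
        """Log whether one production prediction was verified as correct."""
        self.scores.append(1.0 if correct else 0.0)

    def quality(self):
        """Rolling accuracy over the window, or None if nothing recorded yet."""
        return sum(self.scores) / len(self.scores) if self.scores else None

    def needs_retraining(self) -> bool:
        """True once the rolling quality falls below the requested threshold."""
        q = self.quality()
        return q is not None and q < self.threshold
```

In the architecture described above, the `record` calls would be fed by auto-generated test data from processed documents, and a `needs_retraining` signal would trigger integrating new inputs into model training.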
Reliable methods for automatic readability assessment have the potential to impact a variety of fields, ranging from machine translation to self-informed learning. Recently, large language models for the German language (such as GBERT and GPT-2-Wechsel) have become available, allowing the development of Deep Learning based approaches that promise to further improve automatic readability assessment. In this contribution, we studied the ability of ensembles of fine-tuned GBERT and GPT-2-Wechsel models to reliably predict the readability of German sentences. We combined these models with linguistic features and investigated the dependence of prediction performance on ensemble size and composition. Mixed ensembles of GBERT and GPT-2-Wechsel performed better than ensembles of the same size consisting of only GBERT or only GPT-2-Wechsel models. Our models were evaluated in the GermEval 2022 Shared Task on Text Complexity Assessment on a dataset of German sentences. On out-of-sample data, our best ensemble achieved a root mean squared error of 0.435.
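The ensembling and scoring steps behind the reported RMSE can be sketched in a few lines. The synthetic arrays below stand in for the per-sentence readability scores that the fine-tuned GBERT and GPT-2-Wechsel models would produce; only the averaging and the RMSE metric are shown, not the models themselves.

```python
import numpy as np

def ensemble_predict(model_preds: np.ndarray) -> np.ndarray:
    """Average regression outputs across models.

    model_preds : shape (n_models, n_sentences), one row of readability
    scores per fine-tuned model in the ensemble.
    """
    return np.mean(model_preds, axis=0)

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Root mean squared error, the metric used in the shared task."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```

Varying which rows enter `model_preds` (only GBERT, only GPT-2-Wechsel, or a mix) is how the dependence of performance on ensemble size and composition would be probed.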
The production of dispatchable renewable energy will be one of the most important key factors of the future energy supply. Concentrated solar power (CSP) plants operated with molten salt as heat transfer and storage medium are one opportunity to meet this challenge. Due to the high concentration factor of solar tower technology, the maximum process temperature can be further increased, which ultimately decreases the levelized cost of electricity (LCOE) of the technology. The development of an improved tubular molten salt receiver for the next generation of molten salt solar tower plants is the aim of this work. The receiver is designed for receiver outlet temperatures of up to 600 °C. Together with a complete molten salt system, the receiver will be integrated into the Multi-Focus-Tower (MFT) in Jülich (Germany). The paper describes the basic engineering of the receiver, the molten salt tower system, and a laboratory corrosion setup.
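The LCOE argument can be made concrete with the standard definition (not given in the abstract itself): discounted lifetime cost divided by discounted lifetime electricity yield,

```latex
\mathrm{LCOE} \;=\;
\frac{\sum_{t=0}^{n} (I_t + O_t)\,(1+r)^{-t}}
     {\sum_{t=0}^{n} E_t\,(1+r)^{-t}}
```

where $I_t$ is the investment expenditure, $O_t$ the operation and maintenance cost, and $E_t$ the electricity generated in year $t$, with discount rate $r$ and plant lifetime $n$. A higher receiver outlet temperature raises the power-cycle efficiency and hence $E_t$ for a given solar field, which is the mechanism by which it lowers the LCOE.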
Recent earthquakes showed that low-rise URM buildings following code-compliant seismic design and detailing generally behaved very well, without substantial damage. Although advances in simulation tools make nonlinear calculation methods more readily accessible to designers, linear analyses will remain the standard design method for years to come. The present paper aims to improve the linear seismic design method by providing a proper definition of the q-factor of URM buildings. Values of q-factors are derived for low-rise URM buildings with rigid diaphragms, with reference to modern structural configurations realized in low-to-moderate seismic areas of Italy and Germany. The behaviour factor components for deformation and energy dissipation capacity, and for overstrength due to the redistribution of forces, are derived by means of pushover analyses. As a result of the investigations, rationally based values of the behaviour factor q in the range of 2.0 to 3.0 are proposed for use in linear analyses.
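A common decomposition consistent with the two components named in the abstract (the exact formulation is ours, not quoted from the paper) writes the behaviour factor as a product of a ductility/energy-dissipation term and an overstrength term:

```latex
q \;=\; q_{\mu} \cdot \frac{\alpha_u}{\alpha_1}
```

where $q_{\mu}$ accounts for deformation and energy dissipation capacity, and the overstrength ratio $\alpha_u/\alpha_1$ compares the load multiplier at ultimate capacity $\alpha_u$ to the multiplier at first element yielding $\alpha_1$, both obtainable from the pushover curves mentioned above.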
The overall objective of this study is to develop a new external fixator that closely maps the native kinematics of the elbow in order to decrease the joint force, resulting in reduced rehabilitation time and pain. An experimental setup was designed to determine the native kinematics of the elbow during flexion of cadaveric arms. As a preliminary study, data from the literature were used to modify a published biomechanical model for the calculation of the joint and muscle forces. These were compared to the original model, and the effect of the kinematic refinement was evaluated. Furthermore, the obtained muscle forces were determined in order to apply them in the experimental setup. The joint forces in the modified model differed slightly from the forces in the original model. The muscle force curves changed particularly for small flexion angles, but their magnitude for larger angles was consistent.
Biomechanical simulation of different prosthetic meshes for repairing uterine/vaginal vault prolapse
(2017)
Tests with palm tree leaves have only just started, and the scan data are currently being analyzed. The final goal of a future project on palm tree gender and species recognition will be to develop optical scanning technology to be applied to date palm tree leaves for in-situ screening purposes. Depending on the software used and the particular requirements of the users, the technology should be able to identify palm tree diseases, palm tree gender, and the species of young date palm trees by scanning their leaves.
This paper reports the first microbial biosensor for rapid and cost-effective determination of the organophosphorus pesticides fenitrothion and EPN. The biosensor consists of the recombinant PNP-degrading/oxidizing bacterium Pseudomonas putida JS444, anchoring and displaying organophosphorus hydrolase (OPH) on its cell surface as the biological sensing element, and a dissolved oxygen electrode as the transducer. Surface-expressed OPH catalyzed the hydrolysis of fenitrothion and EPN to release 3-methyl-4-nitrophenol and p-nitrophenol, respectively, which were oxidized by the enzymatic machinery of Pseudomonas putida JS444 to carbon dioxide while consuming oxygen, which was measured and correlated to the concentration of organophosphates. Under optimum operating conditions, the biosensor was able to measure as little as 277 ppb of fenitrothion and 1.6 ppm of EPN without interference from phenolic compounds and other commonly used pesticides such as carbamate pesticides, triazine herbicides, and organophosphate pesticides without a nitrophenyl substituent. The applicability of the biosensor to lake water was also demonstrated.
In many historical centers in Europe, stone masonry buildings form part of building aggregates, which developed as the layout of the city or village was densified. The analysis of such building aggregates is very challenging, and modelling guidelines are missing. Advances in the development of analysis methods have been impeded by the lack of experimental data on the seismic response of such aggregates. The SERA project AIMS (Seismic Testing of Adjacent Interacting Masonry Structures) provides such experimental data by testing an aggregate of two buildings under two horizontal components of dynamic excitation. With the aim of advancing the modelling of unreinforced masonry aggregates, a blind prediction competition was organized before the experimental campaign. Each group was provided with a complete set of construction drawings, material properties, the testing sequence, and the list of measurements to be reported. The applied modelling approaches span from equivalent frame models to finite element models using shell elements and discrete element models with solid elements. This paper compares the first entries with regard to the modelling approaches and the results in terms of base shear, roof displacements, interface openings, and failure modes.
Within the framework of the project, a gender- and diversity-oriented teaching evaluation and modern, media-supported blended learning approaches were used in order to achieve the intended goals. First research results of the literature and status quo analysis have already been implemented and tested in newly designed teaching approaches, for example in a multidisciplinary introductory lecture on civil engineering at RWTH Aachen University.
Numerical models have become an essential part of snow avalanche engineering. Recent advances in understanding the rheology of flowing snow and the mechanics of entrainment and deposition have made numerical models more reliable. Coupled with field observations and historical records, they are especially helpful in understanding avalanche flow in complex terrain. However, the application of numerical models poses several new challenges to avalanche engineers. A detailed understanding of the avalanche phenomenon is required to specify the initial conditions (release zone dimensions and snowcover entrainment rates) as well as the friction parameters, which are no longer based on empirical back-calculations but rather on terrain roughness, vegetation and snow properties. In this paper we discuss these problems by presenting the computer model RAMMS, which was specially designed by the SLF as a practical tool for avalanche engineers. RAMMS solves the depth-averaged equations governing avalanche flow with first- and second-order numerical solution schemes. A tremendous effort has been invested in the implementation of advanced input and output features. Simulation results are therefore clearly and easily visualized to simplify their interpretation. More importantly, RAMMS has been applied to a series of well-documented avalanches to gauge model performance. In this paper we present the governing differential equations, highlight some of the input and output features of RAMMS, and then discuss the simulation of the Gatschiefer avalanche that occurred in April 2008 near Klosters/Monbiel, Switzerland.
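Schematically, and in one spatial dimension, depth-averaged avalanche equations of the kind referred to above take the following form; the Voellmy-type friction closure shown is the one commonly used for flowing snow, though the details of the actual RAMMS implementation differ:

```latex
\partial_t h + \partial_x (h u) = \dot{Q}_e ,
\qquad
\partial_t (h u) + \partial_x \!\left( h u^2 + \tfrac{1}{2} g_z h^2 \right)
  = g_x h - S_f ,
\qquad
S_f = \mu\, g_z h + \frac{g\, u^2}{\xi}
```

Here $h$ is the flow depth, $u$ the depth-averaged velocity, $\dot{Q}_e$ the snowcover entrainment rate, $g_x$ and $g_z$ the slope-parallel and slope-normal gravity components, $\mu$ the dry (Coulomb) friction coefficient, and $\xi$ the turbulent friction coefficient. The friction parameters $\mu$ and $\xi$ are exactly those that, as the abstract notes, must now be tied to terrain roughness, vegetation and snow properties rather than to empirical back-calculation alone.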
Proceedings of the International Conference on Material Theory and Nonlinear Dynamics (MatDyn), Hanoi, Vietnam, Sept. 24-26, 2007, 8 p.
In this paper, a method is introduced to determine the limit load of general shells using the finite element method. The method is based on an upper-bound limit and shakedown analysis with an elastic-perfectly plastic material model. A non-linear constrained optimisation problem is solved using Newton's method in conjunction with a penalty method and the Lagrangian dual method. Numerical investigation of a pipe bend subjected to bending moments proves the effectiveness of the algorithm.
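The constrained optimisation problem behind an upper-bound approach of this kind can be sketched in generic form (the paper's discretised FEM version adds per-element kinematic constraints, which are not reproduced here):

```latex
\alpha^{+} \;=\; \min_{\dot{u},\,\dot{\varepsilon}}
  \int_{V} D(\dot{\varepsilon})\,\mathrm{d}V
\quad \text{s.t.} \quad
\dot{\varepsilon} = \nabla^{s}\dot{u},
\qquad
\int_{S} \mathbf{t}\cdot\dot{u}\,\mathrm{d}S = 1
```

where $\alpha^{+}$ is the upper bound on the limit load multiplier, $D(\dot{\varepsilon})$ the plastic dissipation function of the elastic-perfectly plastic material, $\dot{u}$ a kinematically admissible velocity field with strain rate $\dot{\varepsilon}$, and the external load power is normalised to one. The penalty and Lagrangian dual methods mentioned in the abstract are one way to enforce such constraints inside Newton's iteration.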
The integration of frequently changing, volatile product data from different manufacturers into a single catalog is a significant challenge for small and medium-sized e-commerce companies. They rely on timely integration of product data to present it aggregated in an online shop, without knowing the format specifications, the manufacturers' understanding of concepts, or the data quality. Furthermore, format, concepts, and data quality may change at any time. Consequently, integrating product catalogs into a single standardized catalog is often a laborious manual task. Current strategies to streamline or automate catalog integration use techniques based on machine learning, word vectorization, or semantic similarity. However, most approaches struggle with low-quality or real-world data. We propose Attribute Label Ranking (ALR) as a recommendation engine that simplifies, for practitioners, the integration of a previously unknown, proprietary tabular format into a standardized catalog. We evaluate ALR by focusing on the impact of different neural network architectures, language features, and semantic similarity. Additionally, we consider metrics for industrial application and present the impact of ALR in production as well as its limitations.
The integration of product data from heterogeneous sources and manufacturers into a single catalog is often still a laborious, manual task. Especially small and medium-sized enterprises face the challenge of integrating, in a timely manner, the data their business relies on in order to keep an up-to-date product catalog, due to format specifications, low data quality, and the required expert knowledge. Additionally, modern approaches to simplifying catalog integration demand experience in machine learning, word vectorization, or semantic similarity that such enterprises do not have. Furthermore, most approaches struggle with low-quality data. We propose Attribute Label Ranking (ALR), an easy-to-understand and simple-to-adapt learning approach. ALR leverages a model trained on real-world integration data to identify the best possible schema mapping of a previously unknown, proprietary tabular format into a standardized catalog schema. Our approach predicts multiple labels for every attribute of an input column. The whole column is taken into consideration to rank among these labels. We evaluate ALR regarding the correctness of predictions and compare the results on real-world data to state-of-the-art approaches. Additionally, we report findings from our experiments and limitations of our approach.
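The column-level ranking idea can be sketched as follows. The scoring function here is a toy stand-in for the trained neural model; the function names and the additive aggregation are our illustrative assumptions, not the ALR implementation itself.

```python
from collections import Counter

def rank_labels_for_column(values, score_fn, top_k=3):
    """Rank candidate schema labels for one input column.

    values   : the cell values of the column
    score_fn : maps one cell value to {label: score} (stand-in for the
               trained per-value classifier; hypothetical interface)
    top_k    : how many ranked labels to recommend

    Scores are summed over the whole column, so the ranking reflects
    every cell rather than any single value.
    """
    totals = Counter()
    for v in values:
        for label, score in score_fn(v).items():
            totals[label] += score
    return [label for label, _ in totals.most_common(top_k)]
```

With a real model behind `score_fn`, the top-ranked label would be the recommended mapping of the proprietary column onto the standardized catalog schema, with the remaining ranks offered to the practitioner as alternatives.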