Conference Proceeding
Component failures within water supply systems can lead to significant performance losses. One way to address these losses is to explicitly anticipate failures during the design process. We consider a water supply system for high-rise buildings, in which pump failures are the most likely failure scenario. Considering these failures explicitly at an early design stage leads to a more resilient system, i.e., one that is able to operate under a predefined number of arbitrary pump failures. We use a mathematical optimization approach to compute such a resilient design, based on a multi-stage model for topology optimization that can be described by a system of nonlinear inequalities and integrality constraints. Such a model must be computationally tractable while representing the real-world system accurately. We therefore validate the algorithmic solutions experimentally on a scaled test rig for high-rise buildings, which allows pumps to be connected arbitrarily so that scaled versions of booster station designs can be reproduced. The experiments verify the applicability of the presented optimization model and confirm that the proposed resilience properties are also fulfilled in real systems.
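The resilience requirement described above, meeting demand under any predefined number of pump failures, can be illustrated with a deliberately simplified brute-force sketch. This is not the paper's multi-stage nonlinear model; all capacities, costs and the demand value are hypothetical:

```python
from itertools import combinations

def is_resilient(capacities, demand, k):
    """True if the selected pumps still cover the demand after any k
    of them fail; the worst case is losing the k largest pumps."""
    if len(capacities) <= k:
        return False
    survivors = sorted(capacities)[:len(capacities) - k]
    return sum(survivors) >= demand

def cheapest_resilient_design(catalog, demand, k):
    """Enumerate all pump subsets and return the cheapest resilient one.
    catalog: list of (capacity, cost) pairs -- hypothetical data."""
    best, best_cost = None, float("inf")
    for r in range(k + 1, len(catalog) + 1):
        for subset in combinations(catalog, r):
            cost = sum(price for _, price in subset)
            caps = [cap for cap, _ in subset]
            if cost < best_cost and is_resilient(caps, demand, k):
                best, best_cost = subset, cost
    return best, best_cost

# Hypothetical catalog (capacity in m^3/h, cost in arbitrary units)
catalog = [(50, 10), (50, 10), (30, 6), (30, 6), (30, 6), (20, 4)]
design, cost = cheapest_resilient_design(catalog, demand=80, k=1)
```

Checking only the loss of the k largest pumps suffices because any other failure combination leaves at least that much capacity; the actual optimization model additionally captures the hydraulics and the piping topology, which this enumeration ignores.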
For typical cases of non-isolated lightning protection systems (LPS), we investigate the impulse currents that may flow through a human body directly touching a structural part of the LPS. Based on a basic LPS model with conventional down-conductors, the cases of external and internal steel columns and of metal façades in particular are considered and compared. Numerical simulations of the line voltages and currents in the time domain are performed with an equivalent circuit of the entire LPS.
The results show that increasing the number of conventional down-conductors and external steel columns does reduce the threat to a human being, but not to an acceptable limit. If internal steel columns are used as natural down-conductors, the threat can be reduced sufficiently, depending on a low-resistance connection of the steel columns to the lightning equipotential bonding or the earth-termination system, respectively. If a metal façade is used, the threat to a person touching it is usually very low, provided the façade is sufficiently interconnected and connected at multiple points to the lightning equipotential bonding or the earth-termination system, respectively.
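As a back-of-the-envelope companion to this conclusion, the sketch below estimates the body current with a static current divider. The paper itself uses a full time-domain equivalent-circuit simulation, and all resistance values here are hypothetical:

```python
def touch_current(i_peak, n_down, r_conductor, r_body):
    """Static current-divider estimate of the current through a person
    touching one of n identical down-conductors; ignores inductance
    and the transient wave shape that the full simulation captures."""
    i_branch = i_peak / n_down        # stroke current shared equally
    u_touch = i_branch * r_conductor  # voltage along the touched path
    return u_touch / r_body           # current driven through the body

# Hypothetical values: 100 kA stroke, 1 Ohm conductor path, 1 kOhm body
currents = {n: touch_current(100e3, n, 1.0, 1000.0) for n in (4, 8, 16)}
# Even 16 down-conductors leave several amperes through the body,
# far above a safe touch current -- consistent with the finding that
# adding conventional down-conductors alone is not sufficient.
```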
The course Physics for Electrical Engineering is part of the curriculum of the bachelor program Electrical Engineering at the University of Applied Sciences Aachen. Before Covid-19, the course was conducted in a rather traditional way, with all parts (lecture, exercise and lab) taught face-to-face. This teaching approach changed fundamentally within a week when the Covid-19 restrictions forced all courses into distance learning. All parts of the course were transformed into pure distance learning, with synchronous and asynchronous parts for the lecture, live online sessions for the exercises and self-paced labs at home. Using these methods, the course was able to impart the required knowledge and competencies. Taking into account the teacher's observations of the students' learning behaviour and engagement, the students' formal and informal feedback and the exam results, the new methods are evaluated with respect to effectiveness, sustainability and suitability for competence transfer. Based on this analysis, strengths and weaknesses of the concept were identified, along with countermeasures to address the weaknesses. The analysis further leads to a sustainable teaching approach that combines synchronous and asynchronous parts with self-paced learning time and can be used very flexibly for different learning scenarios: pure online, hybrid (a mixture of online and in-person phases) and pure in-person teaching.
The progress in natural language processing (NLP) research over the last years offers novel business opportunities for companies, such as automated user interaction or improved data analysis. Building sophisticated NLP applications requires dealing with modern machine learning (ML) technologies, which hinders enterprises from establishing successful NLP projects. Our experience in applied NLP research projects shows that continuously integrating research prototypes into production-like environments with quality assurance builds trust in the software and demonstrates its convenience and usefulness with regard to the business goal. We introduce STAMP 4 NLP, an iterative and incremental process model for developing NLP applications. With STAMP 4 NLP, we merge software-engineering principles with best practices from data science. Instantiating our process model allows prototypes to be created efficiently by utilizing templates, conventions, and implementations, enabling developers and data scientists to focus on the business goals. Due to our iterative-incremental approach, businesses can deploy an enhanced version of the prototype to their software environment after every iteration, maximizing potential business value and trust early and avoiding the cost of successful yet never-deployed experiments.
In positron emission tomography (PET), improving detector time, energy and spatial resolutions and exploiting Compton kinematics make it possible to reconstruct a radioactivity distribution image from scatter coincidences, thereby enhancing image quality. The number of single-scattered coincidences alone is of the same order of magnitude as that of true coincidences. In this work, a compact Compton camera module based on monolithic scintillation material is investigated as a detector ring module. The detector interactions are simulated with the Monte Carlo package GATE. The scattering angle inside the tissue is derived from the energy of the scattered photon, which results in a set of possible scattering trajectories, or broken lines of response. Compton-kinematics collimation reduces the number of solutions, and the time-of-flight information additionally helps localize the position of the annihilation. One question of this investigation is how the energy, spatial and temporal resolutions help confine the possible annihilation volume. A comparison of currently technically feasible detector resolutions (under laboratory conditions) demonstrates their influence on this annihilation volume and shows that energy and coincidence time resolution have a significant impact: improving the latter from 400 ps to 100 ps shrinks the annihilation volume by around 50%, while improving the energy resolution in the absorber layer from 12% to 4.5% results in a reduction of 60%. The inclusion of single tissue-scattered data has the potential to increase the sensitivity of a scanner by a factor of 2 to 3. The concept can be further optimized, extended to multiple scatter coincidences and subsequently validated with a reconstruction algorithm.
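The derivation of the scattering angle from the scattered photon's energy follows the standard Compton relation, cos θ = 1 − mₑc²(1/E′ − 1/E). A minimal sketch (energies in keV; the example values are hypothetical):

```python
import math

ME_C2 = 511.0  # electron rest energy in keV

def scatter_angle_deg(e_in, e_out):
    """Scattering angle (degrees) from the photon energies before and
    after a single Compton scatter: cos(theta) = 1 - me*c^2*(1/E' - 1/E)."""
    cos_theta = 1.0 - ME_C2 * (1.0 / e_out - 1.0 / e_in)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("energies inconsistent with Compton kinematics")
    return math.degrees(math.acos(cos_theta))

# A 511 keV annihilation photon measured at 400 keV after one scatter
angle = scatter_angle_deg(511.0, 400.0)  # roughly 44 degrees
```

Because the angle depends on 1/E′, the uncertainty of the measured energy translates directly into an angular blur of the scattering cone, which is why the energy resolution of the absorber layer confines the annihilation volume so strongly.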
Past earthquakes have demonstrated the high vulnerability of industrial facilities equipped with complex process technologies, leading to serious damage of the process equipment and the multiple, simultaneous release of hazardous substances. Nevertheless, the seismic design of industrial plants is inadequately covered in recent codes and guidelines, as they do not consider the dynamic interaction between the structure and the installations, and thus neither the effect of the seismic response of the installations on the response of the structure nor vice versa. The current code-based approach for the seismic design of industrial facilities is not sufficient to ensure proper safety against exceptional events entailing loss of content and the related consequences. Accordingly, the SPIF project (Seismic Performance of Multi-Component Systems in Special Risk Industrial Facilities) was proposed within the framework of the European H2020 SERA funding scheme (Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe). The objective of the SPIF project is to investigate the seismic behavior of a representative industrial structure equipped with complex process technology by means of shaking table tests. The test structure is a three-story moment-resisting steel frame with vertical and horizontal vessels and cabinets arranged on the three levels and connected by pipes. The dynamic behavior of the test structure and the installations is investigated with and without base isolation. Furthermore, both firmly anchored and isolated components are taken into account to compare their dynamic behavior and their interactions with each other. Artificial and synthetic ground motions are applied to study the seismic response at different PGA levels. After each test, dynamic identification measurements are carried out to characterize the system condition.
The contribution presents the numerical simulations used to calibrate the tests on the prototype, the experimental setup of the investigated structure and installations, and selected measurement data, and finally describes preliminary experimental results.
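The structure-installation interaction investigated here can be illustrated with a minimal two-degree-of-freedom lumped model (hypothetical masses and stiffnesses, not the SPIF test structure): a component tuned near the storey frequency splits the coupled system's natural frequencies around the uncoupled one, which a design that treats the component as a rigid attached mass cannot capture.

```python
import math

def coupled_frequencies(m_struct, k_struct, m_comp, k_comp):
    """Natural frequencies (Hz) of a 2-DOF structure-component chain,
    from det(K - w^2 M) = 0 solved analytically for the 2x2 system."""
    a = m_struct * m_comp
    b = -((k_struct + k_comp) * m_comp + k_comp * m_struct)
    c = k_struct * k_comp
    disc = math.sqrt(b * b - 4.0 * a * c)
    return tuple(math.sqrt(lam) / (2.0 * math.pi)
                 for lam in ((-b - disc) / (2.0 * a), (-b + disc) / (2.0 * a)))

# Hypothetical values: 10 t storey (k = 4 MN/m) carrying a 200 kg vessel
# whose own frequency is tuned to the storey frequency (~3.18 Hz)
f_uncoupled = math.sqrt(4.0e6 / 1.0e4) / (2.0 * math.pi)
f1, f2 = coupled_frequencies(1.0e4, 4.0e6, 200.0, 8.0e4)
# f1 and f2 straddle f_uncoupled: the installation's response feeds
# back on the structure instead of merely riding on it.
```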
Seismic vulnerability estimation of existing structures is unquestionably a topic of high priority, particularly after earthquake events. Bearing in mind the vast number of old masonry buildings in North Macedonia serving as public institutions, the structural assessment of these buildings is evidently an issue of great importance. In this paper, a comprehensive methodology for the development of seismic fragility curves for existing masonry buildings is presented. A scenario-based method that incorporates knowledge of the tectonic style of the considered region, the characterization of the active faults, the earth-crust model and the historical seismicity (determined via the Neo-Deterministic approach) is used to calculate the necessary response spectra. The capacity of the investigated masonry buildings has been determined using nonlinear static analysis, and the MINEA software (SDA Engineering) is used to verify the structural safety of the structures. The performance point, obtained from the intersection of the building's capacity curve and the spectra used, is selected as the response parameter. The thresholds of the spectral displacement are obtained by splitting the capacity curve into five parts, utilizing empirical formulas expressed as functions of the yield displacement and the ultimate displacement. As a result, four damage limit states are determined. A maximum-likelihood estimation procedure for determining the fragility curves is the final step of the proposed procedure. As a result, a region-specific series of vulnerability curves for the structures is defined.
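The final maximum-likelihood step can be sketched for the common lognormal fragility model, P(exceedance | s) = Φ(ln(s/θ)/β). The coarse grid search below stands in for a proper numerical optimizer, and the demand/exceedance observations are hypothetical:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fragility(s, theta, beta):
    """Lognormal fragility: P(damage state exceeded | demand s),
    with median theta and logarithmic dispersion beta."""
    return phi(math.log(s / theta) / beta)

def fit_mle(demands, exceeded):
    """Coarse grid-search maximum-likelihood fit of (theta, beta) to
    binary exceedance observations."""
    best, best_ll = None, -math.inf
    for theta in (0.5 + 0.05 * i for i in range(60)):
        for beta in (0.2 + 0.02 * j for j in range(40)):
            ll = 0.0
            for s, y in zip(demands, exceeded):
                p = min(max(fragility(s, theta, beta), 1e-9), 1.0 - 1e-9)
                ll += math.log(p if y else 1.0 - p)
            if ll > best_ll:
                best, best_ll = (theta, beta), ll
    return best

# Hypothetical observations: spectral displacement (cm) vs. damage exceeded
demands  = [0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 3.5]
exceeded = [0,   0,   0,   1,   0,   1,   1,   1,   1,   1]
theta, beta = fit_mle(demands, exceeded)
```

In the paper's workflow the demand would be the performance-point displacement and the binary outcome whether a damage-state threshold is exceeded; one such fit per damage limit state yields the family of fragility curves.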