Conference Proceeding
The progress in natural language processing (NLP) research over the last years offers novel business opportunities for companies, such as automated user interaction or improved data analysis. Building sophisticated NLP applications requires dealing with modern machine learning (ML) technologies, which impedes enterprises from establishing successful NLP projects. Our experience in applied NLP research projects shows that the continuous integration of research prototypes into production-like environments with quality assurance builds trust in the software and demonstrates its convenience and usefulness with regard to the business goal. We introduce STAMP 4 NLP, an iterative and incremental process model for developing NLP applications. With STAMP 4 NLP, we merge software engineering principles with best practices from data science. Instantiating our process model allows prototypes to be created efficiently by utilizing templates, conventions, and implementations, enabling developers and data scientists to focus on the business goals. Due to our iterative-incremental approach, businesses can deploy an enhanced version of the prototype to their software environment after every iteration, maximizing potential business value and trust early and avoiding the cost of successful yet never-deployed experiments.
In positron emission tomography (PET), improving the time, energy, and spatial resolution of detectors and exploiting Compton kinematics make it possible to reconstruct a radioactivity distribution image from scatter coincidences, thereby enhancing image quality. The number of single-scattered coincidences alone is of the same order of magnitude as that of true coincidences. In this work, a compact Compton camera module based on monolithic scintillation material is investigated as a detector ring module. The detector interactions are simulated with the Monte Carlo package GATE. The scattering angle inside the tissue is derived from the energy of the scattered photon, which results in a set of possible scattering trajectories, i.e., a broken line of response. Compton kinematics collimation reduces the number of solutions. Additionally, time-of-flight information helps localize the position of the annihilation. One question of this investigation is how the energy, spatial, and temporal resolutions help confine the possible annihilation volume. A comparison of currently technically feasible detector resolutions (under laboratory conditions) demonstrates their influence on this annihilation volume and shows that energy and coincidence time resolution have a significant impact. Improving the latter from 400 ps to 100 ps shrinks the annihilation volume by around 50%, while improving the energy resolution in the absorber layer from 12% to 4.5% results in a reduction of 60%. The inclusion of single tissue-scattered data has the potential to increase the sensitivity of a scanner by a factor of 2 to 3. The concept can be further optimized and extended to multiple scatter coincidences and subsequently validated with a reconstruction algorithm.
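The derivation of the scattering angle from the scattered photon's energy follows directly from Compton kinematics. A minimal sketch (not the authors' code) for 511 keV annihilation photons:

```python
import math

M_E_C2 = 511.0  # electron rest energy in keV

def compton_angle_deg(e_scattered_kev, e_initial_kev=511.0):
    """Scattering angle (degrees) from the scattered photon energy,
    via the Compton relation 1/E' - 1/E = (1 - cos(theta)) / (m_e c^2)."""
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_scattered_kev - 1.0 / e_initial_kev)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("energy outside the kinematically allowed range")
    return math.degrees(math.acos(cos_theta))

# A 511 keV photon that leaves the tissue with 460 keV:
print(compton_angle_deg(460.0))
```

In practice the finite energy resolution of the detector turns this single angle into an angular band, which is why the abstract's set of possible scattering trajectories (the broken line of response) shrinks as energy resolution improves.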
In this study, the process chain of additive manufacturing by means of powder bed fusion is presented for the material glass. In order to process components additively and reliably, new concepts with different solutions were developed and investigated.
Compared to established metallic materials, the properties of glass materials differ significantly. Therefore, the process control was adapted to the material glass in the investigations. With extensive parameter studies based on various glass powders, such as borosilicate glass and quartz glass, scientifically substantiated results on powder bed fusion of glass are presented. Based on the determination of the particle properties with different methods, extensive investigations were made into the melting behavior of glass under laser beams. Furthermore, the experimental setup was steadily expanded. In addition to the integration of coaxial temperature measurement and regulation, preheating of the building platform is of major importance. This offers the possibility of performing 3D printing at the transformation temperatures of the glass materials. To improve the components' properties, the influence of a subsequent heat treatment was also investigated.
The experience gained was incorporated into a new experimental system, which allows a much deeper exploration of the 3D printing of glass. Currently, studies are being conducted to improve surface texture, building accuracy, and geometrical capabilities using three-dimensional specimens.
This contribution traces the development of research in the field of 3D printing of glass, gives an insight into the machine and process engineering, and provides an outlook on possibilities and applications.
A new formulation to calculate the shakedown limit load of Kirchhoff plates under stochastic conditions of strength is developed. Direct structural reliability design by chance-constrained programming is based on prescribed failure probabilities; it is an effective approach of stochastic programming if it can be formulated as an equivalent deterministic optimization problem. We restrict uncertainty to the strength; the loading remains deterministic. A new formulation is derived for random strength with lognormal distribution. Upper-bound and lower-bound shakedown load factors are calculated simultaneously by a dual algorithm.
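The deterministic equivalent mentioned above can be sketched as follows (the notation is ours, not taken from the abstract): if the yield stress $\sigma_Y$ is lognormal, i.e. $\ln\sigma_Y \sim \mathcal{N}(\mu, s^2)$, a chance constraint on an admissible stress level $\sigma$ with failure probability $p$ reduces to a deterministic bound:

```latex
P(\sigma_Y \ge \sigma) \ge 1 - p
\;\Longleftrightarrow\;
\Phi\!\left(\frac{\ln\sigma - \mu}{s}\right) \le p
\;\Longleftrightarrow\;
\sigma \le \exp\!\left(\mu + s\,\Phi^{-1}(p)\right)
```

where $\Phi$ is the standard normal distribution function. The random strength is thus replaced by its $p$-quantile, so the shakedown optimization problem keeps its deterministic structure.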
Project work and interdisciplinarity are integral parts of today's engineering work. It is therefore important to incorporate these aspects into the curriculum of academic engineering studies. At the Faculty of Electrical Engineering and Information Technology, an interdisciplinary project is part of the bachelor program to address these topics. Since the summer term of 2020, most courses have changed to online mode during the Covid-19 crisis, including the interdisciplinary projects. This online mode introduces additional challenges to the execution of the projects, both for the students and for the lecturers. The challenges, but also the risks and opportunities of this kind of project course, are the subject of this paper, based on five different interdisciplinary projects.
During the Covid-19 pandemic, vocational colleges, universities of applied sciences, and technical universities often had to cancel laboratory sessions requiring students' attendance. Yet these, above all, are of decisive importance for giving learners an understanding of theory through practical work. This paper is a contribution to the implementation of distance learning for laboratory work, applicable to several upper secondary educational facilities. Its aim is to provide a paradigm for hybrid teaching to analyze and control a non-linear system represented by a tank model. To this end, we redesign a full series of laboratory sessions on the basis of various challenges, making it suitable for different reference levels of the European Qualifications Framework (EQF). We present problem-based learning through online platforms to compensate for the lack of a laboratory learning environment. With a task derived from their future profession, we give students the opportunity to develop their own solutions in self-defined time intervals. A requirements specification provides the framework conditions in terms of time and content for students, who have to deal with the challenges of the project in a self-organized manner despite inhomogeneous previous knowledge. If the concept of the Complete Action has been introduced in class beforehand, students will automatically apply it while executing the project. The goal is to combine students' scientific understanding with procedural knowledge. We suggest a series of remote laboratory sessions that combine a problem formulation from the subject area of Measurement, Control and Automation Technology with a project assignment common in industry, provided as extracts from a requirements specification.
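A non-linear tank of the kind used in such laboratory sessions is typically modeled with Torricelli outflow. A minimal simulation sketch (all parameters are invented for illustration; the paper's actual tank data are not given here):

```python
import math

# Hypothetical plant parameters -- not taken from the paper.
A = 0.015    # tank cross-section, m^2
a = 2.0e-5   # outlet orifice area, m^2
G = 9.81     # gravitational acceleration, m/s^2

def simulate_tank(q_in, h0=0.0, dt=0.1, t_end=3600.0):
    """Forward-Euler simulation of the nonlinear tank
    dh/dt = (q_in - a*sqrt(2*g*h)) / A  (Torricelli outflow)."""
    h = h0
    for _ in range(int(t_end / dt)):
        h = max(h + dt * (q_in - a * math.sqrt(2 * G * h)) / A, 0.0)
    return h

# The level settles where inflow equals outflow:
# q_in = a*sqrt(2*g*h)  =>  h_ss = (q_in/a)^2 / (2*g)
print(simulate_tank(5.0e-5))
```

The square root in the outflow term is what makes the system non-linear, so students cannot simply apply textbook linear controller tuning; they must linearize around an operating point or use a non-linear design.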
The Robot Operating System (ROS) is the current de facto standard in robot middlewares. The steadily increasing size of the user base results in a greater demand for training as well. User groups range from students in academia to industry professionals, with a broad spectrum of developers in between. To deliver high-quality training and education to all of these audiences, educators need to tailor individual curricula for each such training. In this paper, we present an approach to ease compiling curricula for ROS trainings based on a taxonomy of the teaching contents. The instructor can select a set of dedicated learning units, and the system will automatically compile the teaching material based on the dependencies of the selected units and a set of parameters for a particular training. We walk through an example training to illustrate our work.
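Resolving the dependencies of selected learning units is, at its core, a transitive-closure plus topological-sort problem. A minimal sketch of that idea (unit names and prerequisites are invented; the paper's actual taxonomy is not reproduced here):

```python
from graphlib import TopologicalSorter

# Hypothetical learning units mapped to their prerequisites.
units = {
    "ros-basics": set(),
    "topics": {"ros-basics"},
    "services": {"ros-basics"},
    "tf2": {"topics"},
    "navigation": {"tf2", "services"},
}

def compile_curriculum(selected, deps):
    """Expand the selected units with all transitive prerequisites
    and return them in a valid teaching order."""
    needed, stack = set(), list(selected)
    while stack:
        unit = stack.pop()
        if unit not in needed:
            needed.add(unit)
            stack.extend(deps[unit])
    subgraph = {u: deps[u] & needed for u in needed}
    return list(TopologicalSorter(subgraph).static_order())

# Selecting only "navigation" pulls in everything it builds on:
print(compile_curriculum({"navigation"}, units))
```

`TopologicalSorter` (Python 3.9+ standard library) guarantees prerequisites appear before the units that depend on them, which matches the ordering requirement for teaching material.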
In this paper, we present the structure, the simulation, and the operation of a multi-stage hybrid solar desalination system (MSDH) powered by thermal and photovoltaic (PV) energy. The MSDH system consists of a lower basin, eight horizontal stages, a field of four flat-plate thermal collectors with a total area of 8.4 m², 3 kW of PV panels, and solar batteries. During the day the system is heated by thermal energy, and at night by heating resistors powered by the solar batteries, which are charged by the photovoltaic panels during the day. More specifically, during the day and at night, we analyse the temperature of the stages and the production of distilled water as a function of the solar irradiance and of the electric heating power supplied by the solar batteries. The simulations were carried out for the meteorological conditions of a winter month (February 2020), with irradiance and ambient temperature reaching 824 W/m² and 23 °C, respectively. The results obtained show that during the day, while the system is heated by the thermal collectors, the temperature of the stages and the quantity of water produced reach 80 °C and 30 kg, respectively. At night, from 6 p.m., the system is heated by the electric energy stored in the batteries; the temperature of the stages and the quantity of water produced reach 90 °C and 104 kg, respectively, for an electric heating power of 2 kW. Moreover, when the electric power varies from 1 kW to 3 kW, the quantity of water produced varies from 92 kg to 134 kg. The analysis of these results and their comparison with conventional solar thermal desalination systems shows a clear improvement both in the heating of the stages, by 10%, and in the quantity of water produced, by a factor of 3.
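A standard way to judge such multi-stage stills is the gained output ratio (GOR): the latent heat carried by the distillate divided by the heat supplied. A GOR above 1 indicates latent-heat reuse across the stages. A small consistency sketch for the reported nighttime figures (the 12-hour heating window is our assumption, not stated in the abstract):

```python
H_FG = 2257.0  # latent heat of vaporization of water, kJ/kg (approx., at 100 degC)

def gained_output_ratio(water_kg, power_kw, hours):
    """GOR: latent heat of the distillate over the electrical heat input."""
    q_in_kj = power_kw * hours * 3600.0
    return water_kg * H_FG / q_in_kj

# Reported nighttime case: 104 kg of distillate with 2 kW of heating,
# assuming heating from roughly 6 p.m. over 12 h (an assumption):
print(gained_output_ratio(104.0, 2.0, 12.0))
```

A GOR well above 1 here is plausible only because each of the eight stages recovers the condensation heat of the stage below, which is exactly the mechanism behind the reported factor-of-3 gain over conventional single-basin stills.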
The utilization of phase change material (PCM) for latent heat storage and thermal control of spacecraft has so far been demonstrated in only a few missions. One limiting factor was that all concepts developed to date envisioned the PCM as an additional capacitor, encapsulated in its own housing, leading to mass, efficiency, and accommodation challenges. Recently, the application of PCM within the scan cavity of a GEOS type satellite has been suggested in order to tackle thermal issues due to direct sun intrusion (Choi, M., 2014). However, the application of PCM in such complex mechanical structures is extremely challenging. A new concept to tackle this issue is currently under development at FH Aachen University of Applied Sciences. The concept "Infused Thermal Solutions" (ITS) is based on the idea of 3D printing metallic structures in their regular functional shape, but double-walled with internal lattice support structures, allowing a PCM layer to be infused directly into the voids and eliminating the need for additional parts and interfaces. Together with OHB System, FH Aachen theoretically studied the application of this technology to the Meteosat Third Generation (MTG) Infra-Red Sounder (IRS) instrument. The study focuses on the scan cavity and entrance baffling assembly (EBA) of the IRS. It consists of thermal analyses, 3D redesign, and breadboarding of a scaled, PCM-infused EBA version. In the thermal design of the alternative EBA, PCM was applied directly into the EBA, simulating the worst hot-case sun intrusion of the mission. By applying 4 kg of PCM (to a 60 kg baffle), the EBA temperature excursions during sun intrusion were limited from 140 K to 30 K, leading to a significant thermo-opto-elastic performance gain. This paper introduces the ITS concept development status.
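The sizing logic behind such a PCM buffer is a simple latent-heat balance: the PCM mass must absorb the heat pulse mostly during melting, so the structure's temperature barely rises. A sketch with invented numbers (the abstract does not give the PCM type or pulse energy):

```python
def pcm_mass_for_pulse(q_joule, latent_j_per_kg, cp_j_per_kgk=2000.0, dt_sensible_k=0.0):
    """PCM mass needed to absorb a heat pulse Q, taken up as latent heat
    plus an optional sensible contribution over dt_sensible_k."""
    return q_joule / (latent_j_per_kg + cp_j_per_kgk * dt_sensible_k)

# Assumed values (not from the paper): a paraffin with 200 kJ/kg latent heat
# absorbing a 15-minute, 1 kW sun-intrusion pulse:
q = 1000.0 * 15 * 60  # 900 kJ
print(pcm_mass_for_pulse(q, 200e3))
```

The appeal of the ITS approach is that this mass lives inside walls that are structurally needed anyway, instead of in a separate housing with its own interfaces.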
Bitcoin is a cryptocurrency and is considered a high-risk asset class whose price changes are difficult to predict. Current research focuses on daily price movements with a limited number of predictors. The paper at hand aims at identifying measurable indicators for Bitcoin price movements and at developing a suitable forecasting model for hourly changes. The paper provides three research contributions. First, a set of significant indicators for predicting the Bitcoin price is identified. Second, the results of a trained Long Short-Term Memory (LSTM) neural network that predicts price changes on an hourly basis are presented and compared with other algorithms. Third, the results foster the discussion of the applicability of neural nets for stock price predictions. In total, 47 input features covering a period of over 10 months could be retrieved to train a neural net that predicts Bitcoin price movements with an error rate of 3.52%.
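The gating mechanism that lets an LSTM carry information across many hourly steps can be shown in a few lines. This is only the standard LSTM cell mathematics with random weights, not the paper's trained model or architecture (the 16-unit hidden size is an arbitrary choice; only the 47 features come from the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; the stacked gates are [input, forget, cell, output]."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = 1.0 / (1.0 + np.exp(-z[:n]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[n:2*n]))     # forget gate
    g = np.tanh(z[2*n:3*n])                 # candidate cell state
    o = 1.0 / (1.0 + np.exp(-z[3*n:]))      # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Hypothetical setup: 47 input features (as in the paper), 16 hidden units.
n_in, n_hid = 47, 16
W = rng.normal(0.0, 0.1, (4 * n_hid, n_in))
U = rng.normal(0.0, 0.1, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = c = np.zeros(n_hid)
for x in rng.normal(size=(24, n_in)):  # 24 hourly feature vectors
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)
```

In a real forecasting setup, the final hidden state `h` would feed a dense output layer predicting the next hourly price change, and the weights would be learned rather than sampled.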
The project Coolplan-AIR is dedicated to the further development and field validation of a calculation and design tool for the energy-efficient cooling of buildings with air-based systems. In addition to building and refining simulation models, the complete systems are measured on real-world installations in the field. The focus of the project is on the measurement, simulation, and integration of purely air-based cooling technologies. For cooling generation, air-to-air heat pumps, adiabatic cooling systems and open cooling towers, and VRF (Variable Refrigerant Flow) multi-split systems were measured in the field or on the HSD test stand. The component models are integrated into the Matlab/Simulink toolbox CARNOT and subsequently validated against the previously obtained measurement data.
On the one hand, the measurements allow the operating behavior of system components to be analyzed. On the other hand, the field measurements are intended to verify to what extent the simulation models, which were developed from test-stand measurements in the predecessor project, remain valid for larger device capacities. The developed and implemented systems, consisting of a wide variety of plant models and control components, are tested and qualified so that they can be used reliably in standard design tools.
In addition, energetic monitoring of a lecture hall building on the Jülich campus is carried out, which can be used, among other things, to validate the cooling load calculations of common simulation models.
The project Coolplan-AIR concerns the further development and field validation of a calculation and design tool for the energy-efficient cooling of buildings with air-based systems. In addition to building and refining simulation models, the complete systems are measured on real-world installations in the field. One of the systems considered works with indirect evaporative cooling. This publication presents the development process and the structure of the simulation model for evaporative cooling in the Matlab/Simulink simulation environment with the CARNOT toolbox. Particular attention is paid to the physical model of the heat exchanger in which the evaporation is implemented. The new modelling approach is based on the assumption of an effective heat capacity derived from an enthalpy consideration. Furthermore, the degree of humidification is regarded as constant, and a standardized increase of the heat transfer of the wet compared to the dry heat exchanger is assumed. The model was validated against literature data. For the dry heat exchanger, the maximum absolute error of the calculated outlet temperature (supply air) is smaller than ±0.1 K, and for the wet heat exchanger (cooling case), under the assumption of a constant degree of evaporation, it is smaller than ±0.4 K.
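The idea of an effective heat capacity derived from an enthalpy consideration can be illustrated for moist air along the saturation line: the evaporating water adds a latent term to the dry-air heat capacity. A sketch of that notion (this is textbook psychrometrics with the Magnus formula, not the CARNOT model itself):

```python
import math

P_ATM = 101325.0   # total pressure, Pa
CP_AIR = 1006.0    # specific heat of dry air, J/(kg K)
H_FG = 2.45e6      # latent heat of water around 20 degC, J/kg

def p_sat(t_c):
    """Saturation vapour pressure in Pa (Magnus formula)."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def w_sat(t_c):
    """Saturation humidity ratio, kg water per kg dry air."""
    p = p_sat(t_c)
    return 0.622 * p / (P_ATM - p)

def cp_effective(t_c, dt=0.01):
    """Effective heat capacity along the saturation line:
    c_eff = dh/dT = cp + h_fg * dw_s/dT (central-difference derivative)."""
    dw = (w_sat(t_c + dt) - w_sat(t_c - dt)) / (2.0 * dt)
    return CP_AIR + H_FG * dw

print(cp_effective(20.0))
```

Around 20 °C the latent term roughly triples the effective heat capacity compared to dry air, which is why modelling the wet heat exchanger with an effective capacity captures the strongly increased heat transfer of the evaporative case.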
The paper presents an overview of the past and present of low-emission combustor research with hydrogen-rich fuels at Aachen University of Applied Sciences. In 1990, AcUAS started developing the Dry-Low-NOx Micromix combustion technology. Micromix reduces NOx emissions using jet-in-crossflow mixing of multiple miniaturized fuel jets with the combustor air, with inherent safety against flashback. At first, pure hydrogen as a fuel was investigated in lab-scale applications. Later, Micromix prototypes were developed for use in an industrial gas turbine, the Honeywell/Garrett GTCP-36-300, proving low-NOx characteristics during real gas turbine operation, accompanied by the successful definition of safety laws and control system modifications. Furthermore, the Micromix was optimized for use in annular and can combustors as well as for fuel flexibility with hydrogen-methane mixtures and hydrogen-rich syngas qualities by means of extensive experimental and numerical studies. In 2020, the latest Micromix application will be demonstrated in a commercial 2 MW-class gas turbine can combustor with full-scale engine operation. The paper discusses the advances in Micromix research over the last three decades.