Conference Proceeding
Architects and civil engineers work together regularly in their professional practice and are indispensable to each other. This co-operation is sometimes made more difficult by the differences in their disciplinary languages and approaches. Structures are evaluated by architects on the basis of criteria such as spatial impact and usability, while civil engineers analyze them more closely in terms of their load-bearing and deformation properties, as well as constructive aspects. This diversity of assessment criteria and approaches often carries over into how both academic disciplines view structures.
Within the framework of the Exploratory Teaching Space (ETS), a funding program to improve teaching at RWTH Aachen University and to promote new teaching concepts, a project was carried out jointly by the Junior Professorship of Tool-Culture at the Faculty of Architecture and the Institute of Structural Concrete at the Faculty of Civil Engineering. The aim of the project is to present buildings in such a way that the differences in perception between architects and civil engineers are reduced and the common understanding is promoted.
The project develops a database containing a collection of striking buildings from Aachen and the surrounding area. The buildings are categorized according to terms drawn from both disciplinary fields. The collection can be explored freely or traversed along learning trails. The medium of film plays a special role in presenting the buildings. The buildings are assigned to different categories of load-bearing structures, such as linear, planar and spatial structures, and further to different types of material, functional programs and spatial characteristics. Since the buildings are located in the direct vicinity of Aachen, they can be visited by the students, which sensitizes them to their built environment. Intrinsic motivation as well as implicit learning is encouraged. The paper provides a detailed report of the project, its implementation, the feedback of the students and the plans for further development.
7th International Conference on Reliability of Materials and Structures (RELMAS 2008). June 17-20, 2008; Saint Petersburg, Russia. pp 354-358. Reprint with corrections in red.
Introduction: Analysis of advanced structures working under extremely heavy loading, such as nuclear power plants and piping systems, should take into account the randomness of loading, geometrical and material parameters. The existing reliability analyses are restricted mostly to the elastic working regime, e.g. allowable local stresses. Development of limit and shakedown reliability-based analysis and design methods, exploiting the potential of the shakedown working regime, is highly needed. In this paper the application of a new algorithm of probabilistic limit and shakedown analysis for shell structures is presented, in which the loading and strength of the material as well as the thickness of the shell are considered as random variables. The reliability analysis problems may be efficiently solved by using a system combining the available FE codes, a deterministic limit and shakedown analysis, and the First and Second Order Reliability Methods (FORM/SORM). Non-linear sensitivity analyses are obtained directly from the solution of the deterministic problem without extra computational cost.
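The FORM step mentioned above can be illustrated in a few lines. The following is a minimal sketch, not the paper's implementation: it applies the standard HL-RF iteration to a hypothetical resistance-load limit state with invented parameters, and recovers the closed-form reliability index for this linear case.

```python
import numpy as np
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def form_beta(g, n, tol=1e-8, max_iter=100):
    """HL-RF iteration in standard normal space; returns the reliability index."""
    u = np.zeros(n)
    for _ in range(max_iter):
        h = 1e-6  # finite-difference step for the gradient of g
        grad = np.array([(g(u + h * np.eye(n)[i]) - g(u - h * np.eye(n)[i])) / (2 * h)
                         for i in range(n)])
        u_new = ((grad @ u - g(u)) / (grad @ grad)) * grad  # HL-RF update
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return np.linalg.norm(u)

# invented example: resistance R ~ N(200, 20), load effect S ~ N(120, 30),
# limit state g = R - S expressed in standard normal variables u
muR, sR, muS, sS = 200.0, 20.0, 120.0, 30.0
g = lambda u: (muR + sR * u[0]) - (muS + sS * u[1])
beta = form_beta(g, 2)
pf = phi(-beta)  # first-order failure probability estimate
```

For this linear limit state the analytic index is (muR - muS) / sqrt(sR² + sS²); the iteration reproduces it, and the same loop applies unchanged when g comes from an FE-based limit or shakedown analysis.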
When confining pressure is low or absent, extensional fractures are typical, with fractures occurring on unloaded planes in rock. These “paradox” fractures can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock was developed, but it makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting failure modes that are unexpected in the usual understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD) and as a failure surface at peak load (P). Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass than the simple extension strain criterion.
Limit and shakedown theorems are exact theories of classical plasticity for the direct computation of safety factors or of the load carrying capacity under constant and varying loads. Simple versions of limit and shakedown analysis are the basis of all design codes for pressure vessels and piping. Using Finite Element Methods, more realistic modeling can be employed for a more rational design. The methods can be extended to yield optimum plastic design. In this paper we present a first FE implementation of limit and shakedown analyses for perfectly plastic material. Limit and shakedown analyses are performed for a pipe junction, and an interaction diagram is calculated. The results are in good correspondence with the analytic solution given in the appendix.
Safety and reliability of structures may be assessed indirectly by stress distributions. Limit and shakedown theorems are simplified but exact methods of plasticity that provide safety factors directly in the loading space. These theorems may be used for a direct definition of the limit state function for failure by plastic collapse or by inadaptation. In an FEM formulation, the limit state function is obtained from a nonlinear optimization problem. This direct approach considerably reduces the necessary knowledge of uncertain technological input data, the computing time, and the numerical error. Moreover, the direct way leads to highly effective and precise reliability analyses. The theorems are implemented into a general-purpose FEM program in a way capable of large-scale analysis.
Structural design analyses are conducted with the aim of verifying the exclusion of ratcheting. To this end it is important to make a clear distinction between the shakedown range and the ratcheting range. In cyclic plasticity more sophisticated hardening models have been suggested in order to model the strain evolution observed in ratcheting experiments. The hardening models used in shakedown analysis are comparatively simple. It is shown that shakedown analysis can make quite stable predictions of admissible load ranges despite the simplicity of the underlying hardening models. A linear and a nonlinear kinematic hardening model of two-surface plasticity are compared in material shakedown analysis. Both give identical or similar shakedown ranges. Structural shakedown analyses show that the loading may have a more pronounced effect than the hardening model.
Smoothed Finite Element Methods for Nonlinear Solid Mechanics Problems: 2D and 3D Case Studies
(2016)
The Smoothed Finite Element Method (SFEM) is presented in an edge-based and a face-based variant for 2D and 3D boundary value problems, respectively. SFEMs avoid shortcomings of the standard Finite Element Method (FEM) with lower-order elements, such as overly stiff behavior, poor stress solutions, and locking effects. Based on the idea of spatially averaging the standard FEM strain field over so-called smoothing domains, the SFEM calculates the stiffness matrix for the same number of degrees of freedom (DOFs) as the FEM. However, the SFEMs significantly improve accuracy and convergence even for distorted meshes and/or nearly incompressible materials.
Numerical results of the SFEMs for a cardiac tissue membrane (thin plate inflation) and an artery (tension of 3D tube) show clearly their advantageous properties in improving accuracy particularly for the distorted meshes and avoiding shear locking effects.
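The strain-smoothing step at the core of the SFEM is easy to state concretely. The following is a minimal sketch with invented areas and strains: one edge-based smoothing domain assembled from the two linear triangles sharing an edge, each contributing one third of its area.

```python
import numpy as np

# invented data: one edge-based smoothing domain assembled from the two
# triangles sharing the edge; each triangle contributes one third of its area
areas = np.array([0.5, 0.8])                      # triangle areas
strains = np.array([[1.0e-3, 0.0,    2.0e-4],     # [eps_xx, eps_yy, gamma_xy]
                    [6.0e-4, 4.0e-4, 0.0]])       # constant per linear triangle
contrib = areas / 3.0                             # sub-areas inside this domain
smoothed = (contrib[:, None] * strains).sum(axis=0) / contrib.sum()
# the smoothed strain replaces the compatible strain when the stiffness
# matrix is integrated over the smoothing domain
```

The area-weighted average is all there is to the smoothing operator for constant-strain elements; the accuracy gains come from assembling stiffness over these overlapping domains instead of the individual elements.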
The nonlinear scalar constitutive equations of gases lead to a change in sound speed from point to point, as would be found in linear inhomogeneous (and time-dependent) media. The nonlinear tensor constitutive equations of solids introduce the additional local effect of solution-dependent anisotropy. The speed of a wave passing through a point changes with propagation direction, and its rays are inclined to the front. It is an open question whether the widely used operator splitting techniques achieve a dimensional splitting with physically reasonable results for these multi-dimensional problems. Maybe this is the main reason why the theoretical and numerical investigations of multi-dimensional wave propagation in nonlinear solids lag so far behind gas dynamics. We hope to promote the subject a little by a discussion of some fundamental aspects of the solution of the equations of nonlinear elastodynamics. We use methods of characteristics because they only integrate mathematically exact equations which have a direct physical interpretation.
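The solution-dependent wave speed can be made concrete in one dimension. This sketch uses an invented nonlinear elastic law sigma(eps) = E·eps + b·eps³ with steel-like constants (our example, not the paper's): the characteristic speed follows from the tangent modulus and varies with the local strain, exactly the inhomogeneous-medium effect described above.

```python
import numpy as np

# invented 1D nonlinear elastic law sigma(eps) = E*eps + b*eps**3;
# the characteristic speed c = sqrt(dsigma/deps / rho) then depends on
# the local strain, so the solution itself shapes the wave propagation
E, b, rho = 200e9, 5e13, 7800.0

def wave_speed(eps):
    tangent = E + 3.0 * b * np.asarray(eps) ** 2   # tangent modulus dsigma/deps
    return np.sqrt(tangent / rho)

speeds = wave_speed([0.0, 0.005, 0.01])            # m/s, grows with strain here
```

For b > 0 the material stiffens with strain and the characteristics steepen; in the tensorial 3D case the tangent operator additionally becomes anisotropic, which is what defeats naive dimensional splitting.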
This paper presents the direct route to Design by Analysis (DBA) of the new European pressure vessel standard in the language of limit and shakedown analysis (LISA). This approach leads to an optimization problem. Its solution with Finite Element Analysis is demonstrated for some examples from the DBA manual. One observation from the examples is that the optimisation approach gives reliable and close lower-bound solutions, leading to simple and optimised design decisions.
In: Technical feasibility and reliability of passive safety systems for nuclear power plants. Proceedings of an Advisory Group Meeting held in Jülich, 21-24 November 1994. Vienna, 1996, pp. 43-55 (IAEA-TECDOC-920).
Abstract: It is shown that the difficulty for probabilistic fracture mechanics (PFM) is the general problem of the high reliability of a small population. There is no way around this problem as yet. Therefore, what PFM can contribute to the reliability of steel pressure boundaries is demonstrated with the example of a typical reactor pressure vessel and critically discussed. Although no method is distinguishable that could give exact failure probabilities, PFM offers several additional chances. Upper limits for the failure probability may be obtained together with trends for design and operating conditions. Further, PFM can identify the most sensitive parameters, improved control of which would increase reliability. Thus PFM should play a vital role in the analysis of steel pressure boundaries despite all shortcomings.
Study of swift heavy ion modified conduction polymer composites for application as gas sensor
(2006)
A polyaniline-based conducting composite was prepared by oxidative polymerisation of aniline in a polyvinylchloride (PVC) matrix. Coherent free-standing thin films of the composite were prepared by a solution casting method. The polyvinyl chloride-polyaniline composites, exposed to 120 MeV silicon ions with total ion fluences ranging from 10¹¹ to 10¹³ ions/cm², were observed to be more sensitive towards ammonia gas than the unirradiated composite. The response time of the irradiated composites was observed to be comparably shorter. We report for the first time the application of swift heavy ion modified insulating polymer-conducting polymer (IPCP) composites for sensing of ammonia gas.
The recently proposed NASA and ESA missions to Saturn and Jupiter pose difficult tasks to mission designers because chemical propulsion scenarios are not capable of transferring heavy spacecraft into the outer solar system without the use of gravity assists. Our mission scenario, based on the joint NASA/ESA Titan Saturn System Mission, therefore baselines solar electric propulsion to improve mission flexibility and transfer time. For the calculation of near-globally optimal low-thrust trajectories, we used a method called Evolutionary Neurocontrol, which is implemented in the low-thrust trajectory optimization software InTrance. The studied solar electric propulsion scenario covers trajectory optimization of the interplanetary transfer, including variations of the spacecraft's thrust level, the thrust unit's specific impulse and the solar power generator's power level. Additionally developed software extensions enabled trajectory optimization with launcher-provided hyperbolic excess energy, a complex solar power generator model and a variable-specific-impulse ion engine model. For the investigated mission scenario, Evolutionary Neurocontrol yields good optimization results, which also remain valid for the more elaborate spacecraft models. Compared to Cassini/Huygens, the best solutions found have faster transfer times and higher mission flexibility in general.
Wing weight estimation methodology for highly non-planar lifting systems during conceptual design
(2013)
Micromachined thermal heater platforms offer low electrical power consumption and high modulation speed, i.e. properties which are advantageous for realizing nondispersive infrared (NDIR) gas- and liquid monitoring systems. In this paper, we report on investigations on silicon-on-insulator (SOI) based infrared (IR) emitter devices heated by employing different kinds of metallic and semiconductor heater materials. Our results clearly reveal the superior high-temperature performance of semiconductor over metallic heater materials. Long-term stable emitter operation in the vicinity of 1300 K could be attained using heavily antimony-doped tin dioxide (SnO2:Sb) heater elements.
Magnetic nanoparticles (MNP) are investigated with great interest for biomedical applications in diagnostics (e.g. imaging: magnetic particle imaging (MPI)), therapeutics (e.g. hyperthermia: magnetic fluid hyperthermia (MFH)) and multi-purpose biosensing (e.g. magnetic immunoassays (MIA)). What all of these applications have in common is that they are based on the unique magnetic relaxation mechanisms of MNP in an alternating magnetic field (AMF). While MFH and MPI are currently the most prominent examples of biomedical applications, here we present results on the relatively new biosensing application of frequency mixing magnetic detection (FMMD) from a simulation perspective. In general, we ask how the key parameters of MNP (core size and magnetic anisotropy) affect the FMMD signal: by varying the core size, we investigate the effect of the magnetic volume per MNP; and by changing the effective magnetic anisotropy, we study the MNPs' flexibility to leave their preferred magnetization direction. From this, we predict the most effective combination of MNP core size and magnetic anisotropy for maximum signal generation.
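The origin of the frequency-mixing signal can be illustrated without the full relaxation dynamics studied in the paper. The sketch below is a deliberate simplification (ours, not the authors' model): a static Langevin magnetization responding to a two-frequency drive. Because the Langevin curve is nonlinear and odd, mixing components appear at f1 ± 2·f2 but not at f1 ± f2, which is exactly where FMMD reads out its signal.

```python
import numpy as np

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x, with a series fallback near 0."""
    x = np.asarray(x, dtype=float)
    small = np.abs(x) < 1e-4
    safe = np.where(small, 1.0, x)                 # avoid division by zero
    return np.where(small, x / 3.0, 1.0 / np.tanh(safe) - 1.0 / safe)

# two-frequency excitation in dimensionless field units (invented amplitudes)
f1, f2, fs = 1000.0, 60.0, 20000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
xi = 2.0 * np.sin(2 * np.pi * f1 * t) + 6.0 * np.sin(2 * np.pi * f2 * t)
m = langevin(xi)                                   # normalized magnetization

spec = np.abs(np.fft.rfft(m)) / len(t)             # amplitude spectrum
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
amp = lambda f: spec[np.argmin(np.abs(freqs - f))]
# amp(f1 + 2*f2) is the FMMD readout; amp(f1 + f2) vanishes by odd symmetry
```

Varying the effective core moment in this toy model (the prefactor of the field argument) changes how deep the drive pushes into the nonlinear part of L(x), which is the simulation-level knob corresponding to core size in the abstract.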
In collaborative research projects, both researchers and practitioners work together solving business-critical challenges. These projects often deal with ETL processes, in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are statistical rather than deterministic, their output quality may degrade over time as the input data changes. This leads to an overall quality loss of the application embedding the machine learning models. Hence, software quality in development and in production may differ.
Machine learning models are black boxes. That makes practitioners skeptical and increases the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production offers an early response capability on quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that they integrate possible new inputs into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with an automated software quality monitoring in productive use and a possibility to auto-generate new test data coming from processed documents in production.
Through automated monitoring of the software quality and auto-generated test data, this approach ensures that the software quality meets and keeps requested thresholds in productive use, even during further continuous deployment and changing input data.
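The monitoring loop described above can be sketched in a few lines. Names and the accuracy metric below are our invention, not the paper's reference implementation: each production batch is scored against a threshold, and failing samples are queued as candidate training data.

```python
# minimal sketch of the quality-monitoring idea (names are hypothetical)
def monitor_batch(predictions, labels, threshold=0.9):
    """Score one production batch; failing samples become new training data."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    accuracy = correct / len(labels)
    quality_ok = accuracy >= threshold             # alert / block deployment if False
    retrain_queue = [i for i, (p, y) in enumerate(zip(predictions, labels))
                     if p != y]                    # documents to relabel and re-ingest
    return accuracy, quality_ok, retrain_queue

acc, ok, queue = monitor_batch(["a", "b", "a", "c"], ["a", "b", "b", "c"])
# accuracy 0.75 is below the 0.9 threshold, so sample 2 is queued for relabeling
```

In a real pipeline the labels would come from human corrections of the extracted documents, and the queue would feed the next training run as part of the continuous deployment cycle.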
Humic substances possess distinctive chemical features enabling their use in many advanced applications, including biomedical fields. No chemicals in nature have the same combination of specific chemical and biological properties as humic substances. Traditional medicine and modern research have demonstrated that humic substances from different sources possess immunomodulatory and anti-inflammatory properties, which makes them suitable for the prevention and treatment of chronic dermatoses, allergic rhinitis, atopic dermatitis, and other conditions characterized by inflammatory and allergic responses [1-4]. The use of humic compounds as agents with antifungal and antiviral properties shows great potential [5-7].
Solar sails are propelled in space by reflecting solar photons off large mirroring surfaces, thereby transforming the momentum of the photons into a propulsive force. This innovative concept for low-thrust space propulsion works without any propellant and thus provides a wide range of opportunities for high-energy, low-cost missions. Offering an efficient way of propulsion, solar sailcraft could close a gap in transportation options for highly demanding exploration missions within our solar system and even beyond. On December 17th, 1999, a significant step was made towards the realization of this technology: a lightweight solar sail structure with an area of 20 m × 20 m was successfully deployed on ground in a large facility at the German Aerospace Center (DLR) at Cologne. The deployment from a package of 60 cm × 60 cm × 65 cm with a total mass of less than 35 kg was achieved using four extremely lightweight carbon fiber reinforced plastics (CFRP) booms with a specific mass of 100 g/m. The paper briefly reviews the basic principles of solar sails as well as the technical concept and its realization in the ground demonstration experiment, performed in close cooperation between DLR and ESA. Possible next steps are outlined. They could comprise the in-orbit demonstration of the sail deployment on the upper stage of a low-cost rocket and the verification of the propulsion concept by an autonomous, free-flying solar sail in the frame of a scientific mission. It is expected that the present design could be extended to sail sizes of about (40 m)² up to even (70 m)² without significant mass penalty. With these areas, the maximum achievable thrust at 1 AU would range between 10 and 40 mN, comparable to some electric thrusters. Such prototype sails with a mass between 50 and 150 kg plus a micro-spacecraft of 50 to 250 kg would have a maximum acceleration in the order of 0.1 mm/s² at 1 AU, corresponding to a maximum ΔV capability of about 3 km/s per year.
Two near/medium-term mission examples to a near-Earth asteroid (NEA) will be discussed: a rendezvous mission and a sample return mission.
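The thrust and ΔV figures quoted above follow from elementary arithmetic. This back-of-the-envelope check assumes an ideal, fully reflecting flat sail facing the Sun at 1 AU:

```python
# solar radiation pressure at 1 AU; the factor 2 accounts for full reflection
P_SRP = 4.563e-6                                   # N/m^2
thrust_mN = {side: 2.0 * P_SRP * side ** 2 * 1e3 for side in (40.0, 70.0)}
# -> about 15 mN for a (40 m)^2 sail and 45 mN for a (70 m)^2 sail,
#    bracketing the quoted 10-40 mN range

accel = 0.1e-3                                     # m/s^2, quoted peak acceleration
dv_per_year = accel * 365.25 * 86400.0             # m/s accumulated over one year
# -> about 3.2 km/s, consistent with "about 3 km/s per year"
```

The acceleration figure in turn matches dividing the larger sail's thrust by the quoted 100-400 kg total mass, so the abstract's numbers are internally consistent.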
Recently, in his vision for space exploration, US president Bush announced to extend human presence across the solar system, starting with a human return to the Moon as early as 2015 in preparation for human exploration of Mars and other destinations. In Europe, an exploration program, termed AURORA, was established by ESA in 2001 – funded on a voluntary basis by ESA member states – with a clear focus on Mars and the ultimate goal of landing humans on Mars around 2030 in international cooperation. In 2003, a Human Spaceflight Vision Group was appointed by ESA with the task to develop a vision for the role of human spaceflight during the next quarter of the century. The resulting vision focused on a European-led lunar exploration initiative as part of a multi-decade, international effort to strengthen European identity and economy. After a review of the situation in Europe concerning space exploration, the paper outlines an approach for a consistent positioning of exploration within the existing European space programs, identifies destinations, and develops corresponding scenarios for an integrated strategy, starting with robotic missions to the Moon, Mars, and near-Earth asteroids. The interests of the European planetary in-situ science community, which recently met at DLR Cologne, are considered. Potential robotic lunar missions comprise polar landings to search for frozen volatiles and a sample return. For Mars, the implementation of a modest robotic landing mission in 2009 to demonstrate the capability for landing and prepare more ambitious and complex missions is discussed. For near-Earth asteroid exploration, a low-cost in-situ technology demonstration mission could yield important results. All proposed scenarios offer excellent science and could therefore create synergies between ESA’s mandatory and optional programs in the area of planetary science and exploration. 
The paper intends to stimulate the European discussion on space exploration and reflects the personal view of the authors.
Proceedings of the 2nd Humboldt Kolleg, Hammamet, Tunisia. Organizer: Alexander von Humboldt Stiftung, Germany. 184 p.
Welcome Address
Dear Participants,
Welcome to the 2nd Humboldt Kolleg on “Nanoscale Science and Technology” (NS&T’12) in Tunisia, sponsored by the Alexander von Humboldt foundation. The NS&T’12 multidisciplinary scientific program includes seven “hot” topics dealing with nanoscale science and technology, covering basic and application-oriented research as well as industrial (market) aspects:
- Molecular Biophysics, Spectroscopy Techniques, Imaging Microscopy
- Nanomaterials Synthesis for Medicine and Bio-chemical Sensors
- Nanostructures, Semiconductors, Photonics and Nanodevices
- New Technologies in Market Industry
- Environment, Electro-chemistry, Bio-polymers and Fuel Cells
- Nanomaterials, Photovoltaics, Modelling, Quantum Physics
- Microelectronics, Sensor Networks and Embedded Systems
We are deeply indebted to all members of the Scientific Committee and the General Chairs of the joint sessions, and to all speakers and chairmen, who have dedicated invaluable time and effort to the realization of this event. On behalf of the Organizing Committee, we cordially invite you to join the conference and hope that your stay will be fruitful, rewarding and enjoyable.
Prof. Dr. Michael J. Schöning, Prof. Dr. Adnane Abdelghani
The ANM’09 multi-disciplinary scientific program includes topics in the fields of "Nanotechnology and Microelectronics" ranging from "Bio/Micro/Nano Materials and Interfacing" aspects, "Chemical and Bio-Sensors", "Magnetic and Superconducting Devices", "MEMS and Microfluidics" over "Theoretical Aspects, Methods and Modelling" up to the important bridging "Academics meet Industry".
The understanding that optimized components do not automatically lead to energy-efficient systems shifts the attention from the single component to the entire technical system. At TU Darmstadt, a new field of research named Technical Operations Research (TOR) has emerged, combining mathematical and technical know-how for the optimal design of technical systems. We illustrate our optimization approach in a case study for the design of a ventilation system, with the ambition to minimize the energy consumption for a temporal distribution of diverse load demands. By combining scaling laws with our optimization methods we find the optimal combination of fans and show the advantage of using multiple fans.
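The system-level selection problem can be sketched as a small combinatorial search. All numbers below are invented for illustration (the case study's actual fan data and solver are not reproduced here): candidate fan combinations are scored over a daily load profile, with part-load power scaled from a reference point via the affinity law P ~ Q³.

```python
from itertools import combinations_with_replacement

# hypothetical fan catalogue and demand profile
fans = {"small": {"Q_ref": 2.0, "P_ref": 1.5},     # m^3/s, kW at reference point
        "large": {"Q_ref": 5.0, "P_ref": 4.0}}
demand = [(1.0, 6), (3.0, 10), (6.0, 8)]           # (required flow m^3/s, hours/day)

def energy(combo, demand):
    """Daily energy (kWh) of a fan combination, or inf if it cannot deliver."""
    total = 0.0
    for q_req, hours in demand:
        q_each = q_req / len(combo)                # split the flow equally
        power = 0.0
        for name in combo:
            fan = fans[name]
            if q_each > fan["Q_ref"]:
                return float("inf")                # combination is infeasible
            power += fan["P_ref"] * (q_each / fan["Q_ref"]) ** 3  # affinity law
        total += power * hours
    return total

candidates = [c for n in (1, 2, 3) for c in combinations_with_replacement(fans, n)]
best = min(candidates, key=lambda c: energy(c, demand))
```

With a cubic part-load law, several fans running at partial load beat fewer, larger fans over this profile, which is the qualitative effect the case study demonstrates.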
A novel solar sterilization and water distillation system: experiment and thermodynamic analysis
(1991)
ICSs (Industrial Control Systems) and their subset, SCADA (Supervisory Control and Data Acquisition) systems, are exposed to a constant stream of new threats. The increasing importance of IT security in ICS requires viable methods to assess the security of ICS, their individual components, and their protocols. This paper presents a security analysis focusing on the communication protocols of a single PLC (Programmable Logic Controller). The PLC, a Beckhoff CX2020, is examined and new vulnerabilities of the system are revealed. Based on these findings, recommendations are made to improve the security of the Beckhoff system and its protocols.
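Mapping which services a controller exposes is typically the first step of such an analysis. The sketch below is a plain TCP connect scan and not the paper's tooling; host and port list are placeholders (TCP 48898, for instance, carries Beckhoff's ADS/AMS protocol), and scanning must of course only be done against systems one is authorized to test.

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of ports on host that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:    # 0 means the connect succeeded
                found.append(port)
    return found

# hypothetical usage against a lab PLC:
# open_ports("192.0.2.10", [48898, 102, 502])
```

Each open port then becomes the entry point for the protocol-level examination the paper performs.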
Improved efficiency prediction of a molten salt receiver based on dynamic cloud passage simulation
(2019)
Despite the challenges of pioneering molten salt towers (MST), this remains the leading technology in central receiver power plants today, thanks to cost-effective storage integration and high cost-reduction potential. The limited controllability under volatile solar conditions can cause significant losses, which are difficult to estimate without comprehensive modeling [1]. This paper presents a methodology to generate predictions of the dynamic behavior of the receiver system as part of an operating assistance system (OAS). Based on this, it delivers proposals on whether and when to drain and refill the receiver during a cloudy period in order to maximize the net yield, and quantifies the amount of net electricity gained by this. After prior analysis with a detailed dynamic two-phase model of the entire receiver system, two different reduced modeling approaches were developed and implemented in the OAS. A tailored decision algorithm utilizes both models to deliver the desired predictions efficiently and with appropriate accuracy.
Concerning current efforts to improve operational efficiency and to lower overall costs of concentrating solar power (CSP) plants with prediction-based algorithms, this study investigates the quality and uncertainty of nowcasting data regarding the implications for process predictions. DNI (direct normal irradiation) maps from an all-sky imager-based nowcasting system are applied to a dynamic prediction model coupled with ray tracing. The results underline the need for high-resolution DNI maps in order to predict net yield and receiver outlet temperature realistically. Furthermore, based on a statistical uncertainty analysis, a correlation is developed, which allows for predicting the uncertainty of the net power prediction based on the corresponding DNI forecast uncertainty. However, the study reveals significant prediction errors and the demand for further improvement in the accuracy at which local shadings are forecasted.
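The idea of correlating prediction uncertainty with DNI forecast uncertainty can be illustrated with a toy Monte Carlo experiment. The plant model and all numbers below are invented stand-ins for the dynamic receiver model coupled with ray tracing: DNI forecasts are perturbed by their stated uncertainty, and the resulting net-power spread is related back to the DNI spread.

```python
import numpy as np

# hypothetical linear plant model: net power = gain * DNI
rng = np.random.default_rng(0)
gain = 1000.0 * 0.55          # m^2 aperture times efficiency -> W per (W/m^2)

ratios = []
for sigma_dni in (20.0, 50.0, 100.0):              # W/m^2 forecast uncertainty
    dni = rng.normal(800.0, sigma_dni, 10_000)     # sampled DNI forecasts
    sigma_p = (gain * dni).std()                   # resulting net-power spread, W
    ratios.append(sigma_p / sigma_dni)
# for a linear model the ratio is constant: sigma_P ~ gain * sigma_DNI
```

For the real, nonlinear receiver model the ratio is no longer a constant, which is why the study fits an empirical correlation rather than a single gain; the Monte Carlo mechanics, however, stay the same.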
The worldwide Corona pandemic has severely restricted student projects in the higher semesters of engineering courses. In order not to delay graduation, a new concept had to be developed for projects under lockdown conditions. To this end, unused rooms at the university were to be digitally recorded in order to develop a new usage concept for them as laboratory rooms. An inventory of the actual state of the rooms was made first by taking photos and listing all flaws and peculiarities. After that, a digital site survey was carried out with a 360° laser scanner; the recorded scans were linked into a coherent point cloud and transferred to a software for planning technical building services and supporting Building Information Modelling (BIM). In order to better illustrate the difference between the actual and target state, two virtual reality models were created for realistic demonstration. During the project, the students had to go through all digital planning phases. Technical specifications had to be complied with, as well as documentation, time planning and cost estimates. This project turned out to be an excellent alternative to on-site practical training under lockdown conditions and increased the students' motivation to deal with complex technical questions.
In the context of the Corona pandemic and its impact on teaching, such as digital lectures and exercises, a new concept became necessary, especially for freshmen in demanding courses such as Smart Building Engineering. As there were hardly any face-to-face events at the university, the new teaching concept should enable a good start into engineering studies under pandemic conditions and should also replace the written exam at the end. The students should become active themselves in small teams instead of passively listening to a lecture broadcast online with almost no personal contact. For this purpose, a role play was developed in which the freshmen had to work out a complete solution to the realistic problem of designing, planning and implementing a small guesthouse. Each student in the team had to take a certain role, such as architect, site manager, BIM manager, electrician or technician for HVAC installations. Technical specifications had to be complied with, as well as documentation, time planning and cost estimates. The final project folder had to contain technical documents such as circuit diagrams for electrical components, circuit diagrams for water and heating, design calculations and component lists. In addition, a construction schedule, a construction implementation plan, documentation of the construction progress and minutes of meetings between the various trades had to be submitted. Beyond the project folder, a model of the construction project also had to be created, either as a handmade model or as a digital 3D model using computer-aided design (CAD) software. The first steps in the field of Building Information Modelling (BIM) were also taken by creating a digital model of the building showing the current planning status in real time as a digital twin.
This project turned out to be an excellent training of important student competencies such as teamwork, communication skills and self-organisation, and it also increased motivation to work on complex technical questions. The aim of giving the students a first impression of the challenges and solutions in building projects involving many different technical trades and their points of view was very well achieved, and the concept should be continued in the future.
Urban farming is an innovative and sustainable way of food production and is becoming more and more important in smart city and quarter concepts. It also enables the production of certain foods in places where they usually are not produced, such as the production of fish or shrimps in large cities far away from the coast. Unfortunately, it is not always possible to show students such concepts and systems in real life as part of courses: visits to such industrial plants are sometimes not possible because of distance or are not permitted by the operator for hygienic reasons. In order to give the students the opportunity to get into contact with such an urban farming system and its complex operation, an industrial urban farming plant was set up on a significantly smaller scale. All required technical components, such as water aeration, biological and mechanical filtration or water circulation, were replaced either by aquarium components or by self-designed parts, some produced with a 3D printer. Students from different courses such as mechanical engineering, smart building engineering, biology, electrical engineering, automation technology and civil engineering were involved in this project. This “miniature industrial plant” went into operation and has now been running successfully for two years. Due to the Corona pandemic, home office and remote online lectures, the automation of this miniature plant should be brought to a higher level in the future to provide good remote control over the system and the water quality. The aim of giving the students a chance to get to know the operation of an urban farming plant was very well achieved, and the students had lots of fun “playing” and learning with it in a realistic way.
Modern implementations of driver assistance systems are evolving from pure driver assistance into independently acting automation systems. Still, these systems do not cover the full vehicle usage range, also called the operational design domain, which requires the human driver as a fall-back mechanism. Transitions of control and potential minimum-risk manoeuvres are current research topics and will bridge the gap until fully autonomous vehicles are available. The authors showed in a demonstration that transition-of-control mechanisms can be further improved by the use of communication technology. Receiving incident type and position information via standardised vehicle-to-everything (V2X) messages can improve driver safety and comfort. The connected and automated vehicle’s software framework can use this information to plan areas where the driver should take back control by initiating a transition of control, which can be followed by a minimum-risk manoeuvre in case of an unresponsive driver. This transition of control was implemented in a test vehicle and presented to the public during the IEEE IV2022 (IEEE Intelligent Vehicles Symposium) in Aachen, Germany.
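The fallback logic described in this abstract can be sketched as a small state machine (a minimal illustration only; the state names, decision step, and 10-second timeout are assumptions for this sketch, not the authors' implementation):

```python
from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()              # automation system in control
    TAKEOVER_REQUEST = auto()       # driver asked to take back control
    MANUAL = auto()                 # driver in control
    MINIMUM_RISK_MANEUVER = auto()  # fallback for an unresponsive driver

def step(mode, incident_ahead, driver_responded, seconds_since_request,
         timeout=10.0):
    """One decision step of the hypothetical transition-of-control logic.

    `incident_ahead` stands for an incident announced via a V2X message;
    the automation then plans a takeover before reaching the incident area.
    """
    if mode is Mode.AUTOMATED and incident_ahead:
        return Mode.TAKEOVER_REQUEST
    if mode is Mode.TAKEOVER_REQUEST:
        if driver_responded:
            return Mode.MANUAL
        if seconds_since_request > timeout:
            return Mode.MINIMUM_RISK_MANEUVER
    return mode
```

In a vehicle, such a step would run periodically inside the automation framework; here it only illustrates the sequence "V2X incident → takeover request → manual control or minimum-risk manoeuvre".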
Bitcoin is a cryptocurrency and is considered a high-risk asset class whose price changes are difficult to predict. Current research focuses on daily price movements with a limited number of predictors. The paper at hand aims at identifying measurable indicators for Bitcoin price movements and at developing a suitable forecasting model for hourly changes. The paper provides three research contributions. First, a set of significant indicators for predicting the Bitcoin price is identified. Second, the results of a trained Long Short-Term Memory (LSTM) neural network that predicts price changes on an hourly basis are presented and compared with other algorithms. Third, the results foster discussions of the applicability of neural nets for stock price predictions. In total, 47 input features for a period of over 10 months could be retrieved to train a neural net that predicts Bitcoin price movements with an error rate of 3.52 %.
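The hourly forecasting setup can be illustrated with the windowing step that turns a feature time series into supervised training samples for a recurrent model (a generic sketch; the 24-hour window is an assumption, and the paper's 47 features and LSTM architecture are not reproduced here):

```python
def make_windows(series, window=24):
    """Turn an hourly series into (input window, next-hour target) pairs.

    series: list of (feature_vector, price) tuples ordered by time.
    Returns a list of (window of feature vectors, relative price change)
    samples, as one would feed to a sequence model such as an LSTM.
    """
    samples = []
    for t in range(window, len(series)):
        X = [features for features, _ in series[t - window:t]]
        prev_price = series[t - 1][1]
        next_price = series[t][1]
        y = (next_price - prev_price) / prev_price  # relative hourly change
        samples.append((X, y))
    return samples
```

Each sample pairs the previous 24 hours of indicator values with the price change of the following hour, which is the target the network learns to predict.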
The integration of high-temperature thermal energy storage into existing conventional power plants can help to reduce the CO2 emissions of those plants and, due to synergy effects, lead to lower capital expenditures for building energy storage systems [1]. One possibility to implement this is a molten salt storage system with a powerful power-to-heat unit. This paper presents two possible control concepts for the startup of the charging system of such a facility. The procedures are implemented in a detailed dynamic process model. Performance and safety regarding the film temperatures at heat-transmitting surfaces are investigated in the process simulations. To improve the accuracy in predicting the film temperatures, CFD simulations of the electrical heater are carried out and the results are merged with the dynamic model. The results show that both investigated control concepts are safe regarding the temperature limits. The gradient-controlled startup performed better than the temperature-controlled startup. Nevertheless, there are several uncertainties that need to be investigated further.
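The idea behind a gradient-controlled startup can be illustrated with a simple rate limiter that caps how fast the temperature setpoint is allowed to rise per time step (a simplified sketch under assumed units; the limit value and step size are illustrative and not taken from the paper's process model):

```python
def ramp_setpoint(current, target, max_gradient, dt):
    """Move a temperature setpoint toward `target` without exceeding
    `max_gradient` (K per minute) over a time step of `dt` minutes.

    Limiting the commanded gradient indirectly limits the film
    temperatures at the heat-transmitting surfaces during startup.
    """
    max_step = max_gradient * dt
    delta = target - current
    if abs(delta) <= max_step:
        return target
    return current + max_step if delta > 0 else current - max_step
```

Called repeatedly in a control loop, this produces a linear ramp toward the charging temperature instead of a step change.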
A promising approach to reduce the system costs of molten salt solar receivers is to enable the irradiation of the absorber tubes on both sides. The star design is an innovative receiver design pursuing this approach. The unconventional design leads to new challenges in controlling the system. This paper presents a control concept for a molten salt receiver system in star design. The control parameters are optimized in a defined test cycle by minimizing a cost function. The control concept is tested in realistic cloud-passage scenarios based on real weather data. During these tests, the control system showed no signs of unstable behavior; however, further research and development, such as integrating Model Predictive Control (MPC), is needed for it to perform sufficiently in every scenario. The presented concept is a starting point for doing so.
Adaptive logistics : information management for planning and control of small series assembly
(2007)
Cybersecurity of Industrial Control Systems (ICS) is an important issue, as ICS incidents may have a direct impact on the safety of people or the environment. At the same time, awareness and knowledge about cybersecurity, particularly in the context of ICS, is alarmingly low. Industrial honeypots offer a cheap and easy-to-implement way to raise cybersecurity awareness and to educate ICS staff about typical attack patterns. When integrated into a productive network, industrial honeypots may not only reveal attackers early but may also distract them from the actually important systems of the network. By implementing multiple honeypots as a honeynet, the systems can be used to emulate or simulate a whole Industrial Control System. This paper describes a network of honeypots emulating HTTP, SNMP, S7 communication and the Modbus protocol using Conpot, IMUNES and SNAP7. The nodes mimic SIMATIC S7 programmable logic controllers (PLCs), which are widely used across the globe. The deployed honeypots' features are compared with the features of real SIMATIC S7 PLCs. Furthermore, the honeynet was made publicly available for ten days and the occurring cyberattacks have been analyzed.
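As an illustration of what a Modbus honeypot observes, the following sketch decodes the MBAP header of an incoming Modbus/TCP frame so the requested function code can be logged (a minimal standalone example; it is not part of the Conpot/IMUNES/SNAP7 setup described above):

```python
import struct

def parse_modbus_request(frame: bytes) -> dict:
    """Decode the 7-byte MBAP header plus function code of a
    Modbus/TCP frame, as a honeypot would before logging a request."""
    if len(frame) < 8:
        raise ValueError("frame too short for MBAP header + function code")
    transaction_id, protocol_id, length, unit_id = struct.unpack(
        ">HHHB", frame[:7])
    return {
        "transaction_id": transaction_id,
        "protocol_id": protocol_id,   # always 0 for Modbus
        "length": length,             # byte count following the length field
        "unit_id": unit_id,
        "function_code": frame[7],    # e.g. 3 = Read Holding Registers
    }
```

Logging the function code and the addressed registers over time reveals typical reconnaissance patterns, such as repeated read requests sweeping the register space.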
This paper presents NLP Lean Programming framework (NLPf), a new framework for creating custom natural language processing (NLP) models and pipelines by utilizing common software development build systems. This approach allows developers to train and integrate domain-specific NLP pipelines into their applications seamlessly. Additionally, NLPf provides an annotation tool which improves the annotation process significantly by providing a well-designed GUI and a sophisticated way of using input devices. Due to NLPf’s properties, developers and domain experts are able to build domain-specific NLP applications more efficiently. NLPf is open-source software and available at https://gitlab.com/schrieveslaach/NLPf.
Research collaborations provide opportunities for both practitioners and researchers: practitioners need solutions for difficult business challenges and researchers are looking for hard problems to solve and publish. Nevertheless, research collaborations carry the risk that practitioners focus on quick solutions too much and that researchers tackle theoretical problems, resulting in products which do not fulfill the project requirements.
In this paper we introduce an approach extending the ideas of agile and lean software development. It helps practitioners and researchers keep track of their common research collaboration goal: a scientifically enriched software product which fulfills the needs of the practitioner’s business model.
This approach gives first-class status to application-oriented metrics that measure progress and success of a research collaboration continuously. Those metrics are derived from the collaboration requirements and help to focus on a commonly defined goal.
An appropriate tool set evaluates and visualizes those metrics with minimal effort, and all participants are pushed to focus on their tasks with appropriate effort. Thus, project status, challenges and progress are transparent to all research collaboration members at any time.
With the increased interest in interstellar exploration after the discovery of exoplanets and the proposal by Breakthrough Starshot, this paper investigates the optimisation of photon-sail trajectories in the Alpha-Centauri system. The prime objective is to find the optimal steering strategy for a photonic sail to be captured around one of the stars after a minimum-time transfer from Earth. By extending the idea of the Breakthrough Starshot project with a deceleration phase upon arrival, the mission’s scientific yield is increased. As a secondary objective, transfer trajectories between the stars and orbit-raising manoeuvres to explore the habitable zones of the stars are investigated. All trajectories are optimised for minimum time of flight using the trajectory optimisation software InTrance. Depending on the sail technology, interstellar travel times of 77.6–18,790 years can be achieved, which represents an average improvement of 30% with respect to previous work. Still, significant technological development is required to reach and be captured in the Alpha-Centauri system in less than a century. Therefore, a fly-through mission arguably remains the only option for a first exploratory mission to Alpha Centauri, but the enticing results obtained in this work provide perspective for future long-residence missions to our closest neighbouring star system.
Quantitative evaluation of health management designs for fuel cell systems in transport vehicles
(2022)
Focusing on transport vehicles, mainly with regard to aviation applications, this paper presents a compilation and subsequent quantitative evaluation of methods aimed at building an optimal integrated health management solution for fuel cell systems. The methods are divided into two main types and compiled in a related scheme. Furthermore, the different methods are analysed and evaluated based on parameters specific to the aviation context of this study. Finally, the most suitable method for use in fuel cell health management systems is identified, and its performance and suitability are quantified.
In addition to electromobility and alternative drive systems, a focus is set on electrically driven compressors (EDC), which have a high potential for increasing the efficiency of internal combustion engines (ICE) and fuel cells [01]. The primary objective is to increase the ICE torque independently of the ICE speed by compressing the intake air and thereby raising the ICE filling level with the support of the compressor. For operation independent of the ICE speed, the EDC compressor is decoupled from the turbine by using an electric compressor motor (CM) instead of the turbine. ICE performance can be increased by the use of EDCs whose individual compressor parameters are adapted to the respective application area [02] [03]. This task poses great challenges, increased by demands regarding pollutant reduction while maintaining constant performance and reducing fuel consumption. FH Aachen is equipped with an EDC test bench which enables EDC investigations in various configurations and operating modes. Characteristic properties of different compressors can be determined, which form the basis for a comparison methodology. The subject of this project is the development of a comparison methodology for EDCs with an associated evaluation method and a defined overall evaluation method. For the application of this comparison methodology, corresponding series of measurements are carried out on the EDC test bench using an appropriate test device.
MedicVR : Acceleration and Enhancement Techniques for Direct Volume Rendering in Virtual Reality
(2019)
Pulmonary arterial cannulation is a common and effective method of percutaneous mechanical circulatory support for concurrent right heart and respiratory failure [1]. However, limited data exist on the effect the positioning of the cannula has on oxygen perfusion throughout the pulmonary artery (PA). This study aims to evaluate, using computational fluid dynamics (CFD), the effect of different cannula positions in the PA on the oxygenation of the different branching vessels in order to determine an optimal cannula position. The four chosen cannula positions (see Fig. 1) are: in the lower part of the main pulmonary artery (MPA); in the MPA at the junction between the right pulmonary artery (RPA) and the left pulmonary artery (LPA); in the RPA at its first branch; and in the LPA at its first branch.
Seismic design of buried pipeline systems for energy and water supply is important not only for plant and operational safety but also for maintaining the supply infrastructure after an earthquake. The present paper discusses special issues of seismic wave impacts on buried pipelines, describes calculation methods, proposes approaches and gives calculation examples. It considers the effects of transient displacement differences and the resulting stresses within the pipeline due to the wave propagation of the earthquake. However, the presented model can also be used to calculate fault-rupture-induced displacements. Based on a three-dimensional finite element model, parameter studies are performed to show the influence of several parameters such as incoming wave angle, wave velocity, backfill height and synthetic displacement time histories. The interaction between the pipeline and the surrounding soil is modeled with non-linear soil springs, and the propagating wave is simulated acting on the pipeline at discrete points, independently in time and space. Special attention is given to long-distance heat pipeline systems. Here, expansion bends are arranged at regular distances to accommodate movements of the pipeline due to high temperatures. Such expansion bends are usually designed with small bending radii, which during an earthquake lead to high bending stresses in the cross-section of the pipeline. Finally, an interpretation of the results and recommendations for the most critical parameters are given.
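Non-linear soil springs of the kind mentioned above are commonly idealized as elastic-perfectly-plastic: the restraint force grows linearly with the relative pipe-soil displacement until an ultimate soil resistance is reached (a generic sketch of that idealization; the stiffness and resistance values below are illustrative and not taken from the paper):

```python
def soil_spring_force(displacement, stiffness, ultimate_force):
    """Elastic-perfectly-plastic soil spring: force grows linearly with
    relative displacement (m) at `stiffness` (N/m) and is capped at the
    ultimate soil resistance `ultimate_force` (N) in either direction."""
    force = stiffness * displacement
    if abs(force) > ultimate_force:
        force = ultimate_force if force > 0 else -ultimate_force
    return force
```

In a finite element model, one such spring per direction (axial, lateral, vertical) connects each pipe node to the imposed ground displacement time history.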
The integration of product data from heterogeneous sources and manufacturers into a single catalog is often still a laborious, manual task. Especially small and medium-sized enterprises face the challenge of timely integrating the data their business relies on to keep an up-to-date product catalog, due to format specifications, low data quality and the required expert knowledge. Additionally, modern approaches to simplifying catalog integration demand experience in machine learning, word vectorization, or semantic similarity that such enterprises do not have. Furthermore, most approaches struggle with low-quality data. We propose Attribute Label Ranking (ALR), an easy-to-understand and simple-to-adapt learning approach. ALR leverages a model trained on real-world integration data to identify the best possible schema mapping of a previously unknown, proprietary, tabular format into a standardized catalog schema. Our approach predicts multiple labels for every attribute of an input column. The whole column is taken into consideration to rank among these labels. We evaluate ALR regarding the correctness of predictions and compare the results on real-world data with state-of-the-art approaches. Additionally, we report findings from our experiments and limitations of our approach.
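The core idea of ranking candidate schema labels for a whole input column can be illustrated with a simple heuristic scorer (purely illustrative; ALR itself uses a model trained on real-world integration data, not the hand-written value patterns assumed here):

```python
import re

# Hand-written value patterns standing in for a learned model (assumption).
LABEL_PATTERNS = {
    "ean":    re.compile(r"^\d{13}$"),            # 13-digit article number
    "price":  re.compile(r"^\d+\.\d{2}$"),        # e.g. "12.99"
    "weight": re.compile(r"^\d+(\.\d+)?\s?(g|kg)$"),
}

def rank_labels(column_values):
    """Score every candidate label by the fraction of column values it
    matches and return the labels ranked best-first, so that the whole
    column (not a single cell) informs the ranking."""
    scores = {
        label: sum(bool(p.match(v)) for v in column_values) / len(column_values)
        for label, p in LABEL_PATTERNS.items()
    }
    return sorted(scores, key=scores.get, reverse=True)
```

Ranking over the full column makes the mapping robust to individual malformed cells, which is the kind of low-quality data the abstract mentions.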
The integration of frequently changing, volatile product data from different manufacturers into a single catalog is a significant challenge for small and medium-sized e-commerce companies. They rely on timely integrating product data to present it aggregated in an online shop, without knowing the format specifications, the manufacturers' concept understanding, or the data quality. Furthermore, format, concepts, and data quality may change at any time. Consequently, integrating product catalogs into a single standardized catalog is often a laborious manual task. Current strategies to streamline or automate catalog integration use techniques based on machine learning, word vectorization, or semantic similarity. However, most approaches struggle with low-quality or real-world data. We propose Attribute Label Ranking (ALR) as a recommendation engine that simplifies for practitioners the integration of previously unknown, proprietary tabular formats into a standardized catalog. We evaluate ALR by focusing on the impact of different neural network architectures, language features, and semantic similarity. Additionally, we consider metrics for industrial application and present the impact of ALR in production as well as its limitations.
Often, research results from collaboration projects are not transferred into productive environments even though the approaches are proven to work in demonstration prototypes. These demonstration prototypes are usually too fragile and error-prone to be transferred easily into productive environments; a lot of additional work is required.
Inspired by the idea of an incremental delivery process, we introduce an architecture pattern which combines the approach of Metrics Driven Research Collaboration with microservices for ease of integration. It enables keeping track of project goals over the course of the collaboration while every party may focus on their expert skills: researchers may focus on complex algorithms, practitioners may focus on their business goals.
Through the simplified integration, (intermediate) research results can be introduced into a productive environment, which enables early user feedback and allows for the early evaluation of different approaches. The practitioners’ business model benefits throughout the full project duration.
This paper presents laser-based powder bed fusion (L-PBF) using various glass powders (borosilicate and quartz glass). Compared to metals, these require adapted process strategies. First, the glass powders were characterized with regard to their material properties and their processability in the powder bed. This was followed by investigations of the melting behavior of the glass powders with different laser wavelengths (10.6 µm, 1070 nm). In particular, the experimental setup of a CO2 laser was adapted for the processing of glass powder. An experimental setup with integrated coaxial temperature measurement/control and an inductively heatable build platform was created. This allowed the L-PBF process to be carried out at the transformation temperature of the glasses. Furthermore, the material quality of the components was analyzed on three-dimensional test specimens with regard to porosity, roughness, density and geometrical accuracy in order to evaluate the developed L-PBF parameters and to open up possible applications.