Software development projects often fail because of insufficient code quality. It is now well documented that the task of testing software, for example, is perceived as uninteresting and rather boring, leading to poor software quality and major challenges for software development companies. One promising approach to increasing the motivation for considering software quality is the use of gamification. Initial research works have already investigated the effects of gamification on software developers and come to promising results. Nevertheless, a lack of results from field experiments exists, which motivates the chapter at hand. By conducting a gamification experiment with five student software projects and by interviewing the project members, the chapter provides insights into the changing programming behavior of information systems students when confronted with a leaderboard. The results reveal a motivational effect as well as a reduction of code smells.
Autonomous agents require rich environment models for fulfilling their missions. High-definition (HD) maps are a well-established map format which allows for representing semantic information besides the usual geometric information of the environment, such as road shapes, road markings, traffic signs or barriers. The geometric resolution of HD maps can be as precise as centimetre level. In this paper, we report on our approach of using HD maps as a map representation for autonomous load-haul-dump vehicles in open-pit mining operations. As the mine undergoes constant change, we also need to constantly update the map. Therefore, we follow a lifelong mapping approach for updating the HD maps based on camera-based object detection and GPS data. We describe our mapping algorithm based on the Lanelet2 map format and show its integration with the navigation stack of the Robot Operating System. We present experimental results on our lifelong mapping approach from a real open-pit mine.
Due to the decarbonization of the energy sector, the electric distribution grids are undergoing a major transformation, which is expected to increase the load on the operating resources due to new electrical loads and distributed energy resources. Therefore, grid operators need to gradually move to active grid management in order to ensure safe and reliable grid operation. However, this requires knowledge of key grid variables, such as node voltages, which is why the mass integration of measurement technology (smart meters) is necessary. Another problem is the fact that a large part of the topology of the distribution grids is not sufficiently digitized and models are partly faulty, which means that active grid operation management today has to be carried out largely blindly. It is therefore part of current research to develop methods for determining unknown grid topologies based on measurement data. In this paper, different clustering algorithms are presented and their performance in detecting the topology of low-voltage grids is compared. Furthermore, the influence of measurement uncertainties is investigated in the form of a sensitivity analysis.
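The correlation-based idea behind such topology identification can be sketched in a few lines: smart meters connected to the same feeder tend to show strongly correlated voltage profiles. The following sketch is illustrative only (the specific clustering algorithms compared in the paper are not reproduced here); it groups meters by flood-filling over a correlation threshold.

```python
import numpy as np

def cluster_meters_by_voltage(voltages, threshold=0.9):
    """Group smart meters whose voltage time series are strongly correlated.

    voltages: array of shape (n_meters, n_samples), one series per meter.
    Meters on the same feeder see similar load-driven voltage changes,
    so high pairwise correlation hints at a shared topology branch.
    """
    corr = np.corrcoef(voltages)          # pairwise correlation matrix
    n = len(voltages)
    labels = -np.ones(n, dtype=int)       # -1 marks "not yet assigned"
    current = 0
    for i in range(n):
        if labels[i] == -1:
            # start a new cluster and flood-fill over correlated meters
            stack = [i]
            labels[i] = current
            while stack:
                j = stack.pop()
                for k in range(n):
                    if labels[k] == -1 and corr[j, k] > threshold:
                        labels[k] = current
                        stack.append(k)
            current += 1
    return labels
```

In practice the measurement uncertainties studied in the paper would lower the off-diagonal correlations, which is why the threshold (here a hypothetical 0.9) matters.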
AI-based systems are nearing ubiquity not only in everyday low-stakes activities but also in medical procedures. To protect patients and physicians alike, explainability requirements have been proposed for the operation of AI-based decision support systems (AI-DSS), which adds hurdles to the productive use of AI in clinical contexts. This raises two questions: Who decides these requirements? And how should access to AI-DSS be provided to communities that reject these standards (particularly when such communities are expert-scarce)? This chapter investigates a dilemma that emerges from the implementation of global AI governance. While rejecting global AI governance limits the ability to help communities in need, global AI governance risks undermining and subjecting health-insecure communities to the force of the neo-colonial world order. For this, this chapter first surveys the current landscape of AI governance and introduces the approach of relational egalitarianism as key to (global health) justice. To discuss the two horns of the referred dilemma, the core power imbalances faced by health-insecure collectives (HICs) are examined. The chapter argues that only strong demands of a dual strategy towards health-secure collectives can both remedy the immediate needs of HICs and enable them to become healthcare independent.
Digital forensics of smartphones is of utmost importance in many criminal cases. As modern smartphones store chats, photos, videos etc. that can be relevant for investigations and as they can have storage capacities of hundreds of gigabytes, they are a primary target for forensic investigators. However, it is exactly this large amount of data that is causing problems: extracting and examining the data from multiple phones seized in the context of a case is taking more and more time. This bears the risk of wasting a lot of time with irrelevant phones while there is not enough time left to analyze a phone which is worth examination. Forensic triage can help in this case: Such a triage is a preselection step based on a subset of data and is performed before fully extracting all the data from the smartphone. Triage can accelerate subsequent investigations and is especially useful in cases where time is essential. The aim of this paper is to determine which and how much data from an Android smartphone can be made directly accessible to the forensic investigator – without tedious investigations. For this purpose, an app has been developed that can be used with extremely limited storage of data in the handset and which outputs the extracted data immediately to the forensic workstation in a human- and machine-readable format.
Experimental determination of the cross sections of proton capture on radioactive nuclei is extremely difficult, yet such data are of substantial interest for the understanding of the production of the p-nuclei. For the first time, a direct measurement of proton-capture cross sections on stored, radioactive ions became possible in an energy range of interest for nuclear astrophysics. The experiment was performed at the Experimental Storage Ring (ESR) at GSI by making use of a sensitive method to measure (p,γ) and (p,n) reactions in inverse kinematics. These reaction channels are of high relevance for the nucleosynthesis processes in supernovae, which are among the most violent explosions in the universe and are not yet well understood. The cross section of the ¹¹⁸Te(p,γ) reaction has been measured at energies of 6 MeV/u and 7 MeV/u. The heavy ions interacted with a hydrogen gas jet target. The radiative recombination process of the fully stripped ¹¹⁸Te ions and electrons from the hydrogen target was used as a luminosity monitor. An overview of the experimental method and preliminary results from the ongoing analysis will be presented.
Due to the increasing complexity of software projects, software development is becoming more and more dependent on teams. The quality of this teamwork can vary depending on the team composition, as teams are always a combination of different skills and personality types. This paper aims to answer the question of how to describe a software development team and what influence the personality of the team members has on the team dynamics. For this purpose, a systematic literature review (n=48) and a literature search with the AI research assistant Elicit (n=20) were conducted. Result: A person’s personality significantly shapes his or her thinking and actions, which in turn influences his or her behavior in software development teams. It has been shown that team performance and satisfaction can be strongly influenced by personality. The quality of communication and the likelihood of conflict can also be attributed to personality.
The RoboCup Logistics League (RCLL) is a robotics competition in a production logistics scenario in the context of a Smart Factory. In the competition, a team of three robots needs to assemble products to fulfill various orders that are requested online during the game. This year, the Carologistics team was able to win the competition with a new approach to multi-agent coordination as well as significant changes to the robot’s perception unit and a pragmatic network setup using the cellular network instead of WiFi. In this paper, we describe the major components of our approach with a focus on the changes compared to the last physical competition in 2019.
Modern implementations of driver assistance systems are evolving from pure driver assistance to independently acting automation systems. Still, these systems do not yet cover the full vehicle usage range, also called the operational design domain, and therefore require the human driver as a fall-back mechanism. Transition of control and potential minimum-risk manoeuvres are current research topics and will bridge the gap until fully autonomous vehicles are available. The authors showed in a demonstration that transition-of-control mechanisms can be further improved by the use of communication technology. Receiving incident type and position information via standardised vehicle-to-everything (V2X) messages can improve driver safety and comfort. The connected and automated vehicle's software framework can use this information to plan areas where the driver should take back control by initiating a transition of control, which can be followed by a minimum-risk manoeuvre in case of an unresponsive driver. This transition of control has been implemented in a test vehicle and was presented to the public during IEEE IV2022 (IEEE Intelligent Vehicles Symposium) in Aachen, Germany.
Lead and nickel, as heavy metals, are still used in industrial processes, and are classified as “environmental health hazards” due to their toxicity and polluting potential. The detection of heavy metals can prevent environmental pollution at toxic levels that are critical to human health. In this sense, the electrolyte–insulator–semiconductor (EIS) field-effect sensor is an attractive sensing platform for the fabrication of reusable and robust sensors to detect such substances. This study aimed to fabricate a sensing unit on an EIS device based on Sn₃O₄ nanobelts embedded in a polyelectrolyte matrix of polyvinylpyrrolidone (PVP) and polyacrylic acid (PAA) using the layer-by-layer (LbL) technique. The EIS-Sn₃O₄ sensor exhibited enhanced electrochemical performance for detecting Pb²⁺ and Ni²⁺ ions, revealing a higher affinity for Pb²⁺ ions, with sensitivities of ca. 25.8 mV/decade and 2.4 mV/decade, respectively. These results indicate that Sn₃O₄ nanobelts can serve as a feasible proof-of-concept capacitive field-effect sensor for heavy metal detection, paving the way for future studies focusing on environmental monitoring.
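The reported sensitivities (mV/decade) correspond to the slope of a calibration line of sensor potential versus the decadic logarithm of ion concentration. A minimal sketch of that computation, using hypothetical calibration data (the paper's raw measurements are not reproduced here):

```python
import numpy as np

def sensitivity_mv_per_decade(concentrations_molar, potentials_mv):
    """Fit potential vs. log10(concentration) and return the slope.

    The slope in mV per concentration decade is the usual sensitivity
    figure for potentiometric/capacitive field-effect ion sensors.
    """
    log_c = np.log10(concentrations_molar)
    slope, _intercept = np.polyfit(log_c, potentials_mv, 1)
    return float(slope)
```

For comparison, an ideal Nernstian response for a divalent ion such as Pb²⁺ would be roughly 59/2 ≈ 29.5 mV/decade at room temperature.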
Ice melting probes
(2023)
The exploration of icy environments in the solar system, such as the poles of Mars and the icy moons (a.k.a. ocean worlds), is a key aspect for understanding their astrobiological potential as well as for extraterrestrial resource inspection. On these worlds, ice melting probes are considered to be well suited for the robotic clean execution of such missions. In this chapter, we describe ice melting probes and their applications, the physics of ice melting and how the melting behavior can be modeled and simulated numerically, the challenges for ice melting, and the required key technologies to deal with those challenges. We also give an overview of existing ice melting probes and report some results and lessons learned from laboratory and field tests.
Even the shortest flight through unknown, cluttered environments requires reliable local path planning algorithms to avoid unforeseen obstacles. The algorithm must evaluate alternative flight paths and identify the best path if an obstacle blocks its way. Commonly, weighted sums are used here. This work shows that weighted Chebyshev distances and factorial achievement scalarising functions are suitable alternatives to weighted sums if combined with the 3DVFH* local path planning algorithm. Both methods considerably reduce the failure probability of simulated flights in various environments. The standard 3DVFH* uses a weighted sum and has a failure probability of 50% in the test environments. A factorial achievement scalarising function, which minimises the worst combination of two out of four objective functions, reaches a failure probability of 26%; a weighted Chebyshev distance, which optimises the worst objective, has a failure probability of 30%. These results show promise for further enhancements and broader applicability.
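The two scalarisation schemes differ only in how the per-objective costs of a candidate path are aggregated: the weighted sum allows one very poor objective to be compensated by good ones, while the weighted Chebyshev distance judges a path by its single worst weighted objective. A minimal sketch (function and parameter names are illustrative, not from the 3DVFH* codebase):

```python
import numpy as np

def weighted_sum(costs, weights):
    """Classic scalarisation: sum of weighted objective values."""
    return float(np.dot(weights, costs))

def weighted_chebyshev(costs, weights):
    """Scalarise by the single worst (largest) weighted objective."""
    return float(np.max(np.asarray(weights) * np.asarray(costs)))

def best_path(candidates, weights, scalarise):
    """Pick the candidate path whose scalarised cost is lowest."""
    return min(candidates, key=lambda c: scalarise(c, weights))
```

With costs (0.9, 0.05) versus (0.5, 0.5) and equal weights, the weighted sum prefers the first candidate, while the Chebyshev distance rejects its extreme 0.9 objective and prefers the balanced one, which illustrates why the choice of scalarisation changes which flight path is selected.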
KNX is a protocol for smart building automation, e.g., for automated heating, air conditioning, or lighting. This paper analyses and evaluates state-of-the-art KNX devices from the manufacturers Merten, Gira and Siemens with respect to security. On the one hand, it is investigated whether publicly known vulnerabilities, like insecure storage of passwords in software, unencrypted communication, or denial-of-service attacks, can be reproduced in new devices. On the other hand, the security is analyzed in general, leading to the discovery of a previously unknown, high-risk vulnerability related to so-called BCU (authentication) keys.
Selected problems in the field of multivariate statistical analysis are treated. Thereby, one focus is on the paired sample case. Among other things, statistical testing problems of marginal homogeneity are under consideration. In detail, properties of Hotelling's T² test in a special parametric situation are obtained. Moreover, the nonparametric problem of marginal homogeneity is discussed on the basis of possibly incomplete data. In the bivariate data case, properties of the Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic on the basis of partly not identically distributed data are investigated. Similar testing problems are treated within the scope of the application of a result for the empirical process of the concomitants for partly categorial data. Furthermore, testing changes in the modeled solvency capital requirement of an insurance company by means of a paired sample from an internal risk model is discussed. Beyond the paired sample case, a new asymptotic relative efficiency concept based on the expected volumes of multidimensional confidence regions is introduced. Besides, a new approach for the treatment of the multi-sample goodness-of-fit problem is presented. Finally, a consistent test for the goodness-of-fit problem against the background of huge or infinite-dimensional data is developed.
The first and last mile of a railway journey, in both freight and transit applications, constitutes a high effort and is either non-productive (e.g. in the case of depot operations) or highly inefficient (e.g. in industrial railways). These parts are typically managed on-sight, i.e. with no signalling and train protection systems ensuring the freedom of movement. This is possible due to the rather short braking distances of individual vehicles and shunting consists. The present article analyses the braking behaviour of such shunting units. For this purpose, a dedicated model is developed. It is calibrated on published results of brake tests and validated against a high-definition model for low-speed applications. Based on this model, multiple simulations are executed to obtain a Monte Carlo simulation of the resulting braking distances. Based on the distribution properties and established safety levels, the risk of exceeding certain braking distances is evaluated and maximum braking distances are derived. Together with certain parameters of the system, these can serve in the design and safety assessment of driver assistance systems and automation of these processes.
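The Monte Carlo step described above can be sketched with a simplified point-mass braking model: sample the uncertain brake parameters, compute the resulting distances, and read off a high quantile as the basis for a maximum braking distance. All parameter values below are illustrative, not the calibrated values from the article.

```python
import numpy as np

def braking_distance(v0, t_delay, a_brake):
    """Distance covered during brake build-up plus full braking.

    v0: initial speed [m/s]; t_delay: reaction/build-up time [s];
    a_brake: achieved deceleration [m/s^2]. Point-mass kinematics only.
    """
    return v0 * t_delay + v0**2 / (2.0 * a_brake)

def monte_carlo_braking(v0=8.0, n=100_000, quantile=0.999, seed=1):
    """Sample uncertain brake parameters and return a high quantile
    of the braking distance distribution (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    t_delay = np.clip(rng.normal(1.0, 0.2, n), 0.2, None)   # build-up time
    a_brake = np.clip(rng.normal(0.8, 0.1, n), 0.4, None)   # low shunting deceleration
    d = braking_distance(v0, t_delay, a_brake)
    return float(np.quantile(d, quantile))
```

The quantile chosen would in practice follow from the established safety levels mentioned in the article; 0.999 here is only a placeholder.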
Clinical assessment of newly developed sensors is important for ensuring their validity. Comparing recordings of emerging electrocardiography (ECG) systems to a reference ECG system requires accurate synchronization of data from both devices. Current methods can be inefficient and prone to errors. To address this issue, three algorithms are presented to synchronize two ECG time series from different recording systems: Binned R-peak Correlation, R-R Interval Correlation, and Average R-peak Distance. These algorithms reduce ECG data to their cyclic features, mitigating inefficiencies and minimizing discrepancies between different recording systems. We evaluate the performance of these algorithms using high-quality data and then assess their robustness after manipulating the R-peaks. Our results show that R-R Interval Correlation was the most efficient, whereas the Average R-peak Distance and Binned R-peak Correlation were more robust against noisy data.
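The binned-correlation idea can be sketched as follows: both recordings are reduced to binary "peak trains" on a common time grid, and the lag that maximises their overlap estimates the offset between the devices. Parameter names and default values are illustrative, not taken from the paper.

```python
import numpy as np

def estimate_offset(peaks_a, peaks_b, bin_s=0.01, max_lag_s=5.0):
    """Estimate the time offset between two ECG recordings from their
    R-peak timestamps (in seconds) via correlation of binned peak trains.

    Returns the offset such that peaks_b ~ peaks_a + offset.
    """
    t_max = max(peaks_a.max(), peaks_b.max()) + max_lag_s
    n_bins = int(np.ceil(t_max / bin_s)) + 1
    train_a = np.zeros(n_bins)
    train_b = np.zeros(n_bins)
    train_a[np.round(peaks_a / bin_s).astype(int)] = 1.0
    train_b[np.round(peaks_b / bin_s).astype(int)] = 1.0
    max_lag = int(round(max_lag_s / bin_s))
    lags = np.arange(-max_lag, max_lag + 1)
    # shift train_b back by each candidate lag and score the overlap
    scores = [np.dot(train_a, np.roll(train_b, -k)) for k in lags]
    return float(lags[int(np.argmax(scores))] * bin_s)
```

Because only R-peak timestamps enter the computation, differences in sampling rate, amplitude scaling, or noise floor between the two recording systems largely drop out, which is the point of reducing the signals to their cyclic features.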
This paper introduces an inexpensive Wiegand-sensor-based rotary encoder that avoids rotating magnets and is suitable for electrical-drive applications. So far, Wiegand-sensor-based encoders usually include a magnetic pole wheel with rotating permanent magnets. These encoders combine the disadvantages of an increased magnet demand and a limited maximum speed due to the centripetal force acting on the rotating magnets. The proposed approach reduces the total demand for permanent magnets drastically. Moreover, the rotating part can be manufactured from a single piece of steel, which makes it very robust and cheap. This work presents the theoretical operating principle of the proposed approach and validates its benefits on a hardware prototype. The presented proof-of-concept prototype achieves a mechanical resolution of 4.5° by using only 4 permanent magnets, 2 Wiegand sensors and a rotating steel gear wheel with 20 teeth.
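The stated resolution is consistent with simple event counting: 20 teeth sensed by 2 Wiegand sensors with two distinguishable pulses per passing tooth yield 80 events per revolution, i.e. 360°/80 = 4.5°. The pulse scheme assumed here is an illustrative model, not taken from the paper.

```python
def encoder_resolution_deg(n_teeth, n_sensors, pulses_per_tooth_per_sensor=2):
    """Mechanical resolution of a gear-wheel encoder from event counting.

    Assumes each sensor fires a fixed number of distinguishable pulses
    per passing tooth (hypothetical counting model for illustration).
    """
    events_per_rev = n_teeth * n_sensors * pulses_per_tooth_per_sensor
    return 360.0 / events_per_rev
```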
Traditional vulcanization mold manufacturing is complex, costly, and under pressure due to shorter product lifecycles and diverse variations. Additive manufacturing using Fused Filament Fabrication and high-performance polymers like PEEK offers a promising future in this industry. This study assesses the compressive strength of various infill structures (honeycomb, grid, triangle, cubic, and gyroid) when considering two distinct build directions (Z, XY) to enhance PEEK’s economic and resource efficiency in rapid tooling. A comparison with PETG samples shows the behavior of the infill strategies. Additionally, a proof of concept illustrates the application of a PEEK mold in vulcanization. A peak compressive strength of 135.6 MPa was attained in specimens that were 100% solid and subjected to thermal post-treatment. This corresponds to a 20% strength improvement in the Z direction. In terms of time and mechanical properties, the anisotropic grid and isotropic cubic infill emerged as the most suitable for rapid tooling. Furthermore, the study highlights that reducing the layer thickness from 0.15 mm to 0.1 mm can result in a 15% strength increase. The study unveils the successful utilization of a room-temperature FFF-printed PEEK mold in vulcanization injection molding. The parameters and infill strategies identified in this research enable the resource-efficient FFF printing of PEEK without compromising its strength properties. Using PEEK in rapid tooling allows a cost reduction of up to 70% in tool production.
Manufacturing companies across multiple industries face an increasingly dynamic and unpredictable environment. This development can be seen on both the market and supply side. To respond to these challenges, manufacturing companies must implement smart manufacturing systems and become more flexible and agile. The flexibility in operational planning regarding the scheduling and sequencing of customer orders needs to be increased, and new structures must be implemented in manufacturing systems’ fundamental design, as they constitute much of the operational flexibility available. To this end, smart and more flexible solutions for production planning and control (PPC) are developed. However, scheduling and sequencing are often considered only in isolation, in a predefined, stable environment. Moreover, existing solutions are oriented towards the fundamental logic of established IT systems, which limits their applicability in a dynamic environment. This paper presents a conceptual model for a task-based description logic that can be applied to factory planning, technology planning, and operational control. By using service-oriented architectures, the goal is to generate smart manufacturing systems. The logic is designed to allow for easy and automated maintenance. It is compatible with the existing resource and process allocation logic across operational and strategic factory and production planning.
Melting probes are a proven tool for the exploration of thick ice layers and clean sampling of subglacial water on Earth. Their compact size and ease of operation also make them a key technology for the future exploration of icy moons in our Solar System, most prominently Europa and Enceladus. For both mission planning and hardware engineering, metrics such as efficiency and expected performance in terms of achievable speed, power requirements, and necessary heating power have to be known.
Theoretical studies aim at describing thermal losses on the one hand, while laboratory experiments and field tests allow an empirical investigation of the true performance on the other hand. To investigate the practical value of a performance model for the operational performance in extraterrestrial environments, we first contrast measured data from terrestrial field tests on temperate and polythermal glaciers with results from basic heat loss models and a melt trajectory model. For this purpose, we propose conventions for the determination of two different efficiencies that can be applied to both measured data and models. One definition of efficiency is related to the melting head only, while the other definition considers the melting probe as a whole. We also present methods to combine several sources of heat loss for probes with a circular cross-section, and to translate the geometry of probes with a non-circular cross-section to analyse them in the same way. The models were selected in a way that minimizes the need to make assumptions about unknown parameters of the probe or the ice environment.
The results indicate that currently used models do not yet reliably reproduce the performance of a probe under realistic conditions. Melting velocities and efficiencies are consistently overestimated by 15 to 50% by the models, although they qualitatively agree with the field test data. Hence, losses are observed that are not yet covered and quantified by the available loss models. We find that the deviation increases with decreasing ice temperature. We suspect that this mismatch is mainly due to an overly restrictive idealization of the probe model and the fact that the probe was not operated in an efficiency-optimized manner during the field tests. With respect to space mission engineering, we find that performance and efficiency models must be used with caution in unknown ice environments, as various ice parameters have a significant effect on the melting process. Some of these are difficult to estimate from afar.
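The loss-free baseline against which such efficiencies are measured can be illustrated with a textbook energy balance: all heating power goes into warming the ice column below the head to 0 °C and melting it. This is a generic model for a probe with a circular cross-section, not the specific efficiency conventions proposed in the paper.

```python
import math

# Approximate material properties of ice
RHO_ICE = 917.0       # density [kg/m^3]
C_ICE = 2100.0        # specific heat capacity [J/(kg K)]
L_FUSION = 334000.0   # latent heat of fusion [J/kg]

def ideal_melting_velocity(power_w, radius_m, ice_temp_c):
    """Upper-bound (loss-free) melting velocity of a cylindrical probe.

    Energy per cubic metre of ice = warming from ice_temp_c to 0 degC
    plus latent heat; velocity = power / (cross-section * energy density).
    """
    area = math.pi * radius_m ** 2
    energy_per_m3 = RHO_ICE * (C_ICE * (0.0 - ice_temp_c) + L_FUSION)
    return power_w / (area * energy_per_m3)

def probe_efficiency(measured_velocity, power_w, radius_m, ice_temp_c):
    """Ratio of measured to ideal melting velocity (1.0 = no losses)."""
    return measured_velocity / ideal_melting_velocity(power_w, radius_m, ice_temp_c)
```

For a hypothetical 1 kW probe of 5 cm radius in ice at -10 °C this gives a velocity on the order of 1.4 m/h; the field data discussed above would sit 15 to 50% below such an ideal figure.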
Achieving the 17 Sustainable Development Goals (SDGs) set by the United Nations (UN) in 2015 requires global collaboration between different stakeholders. Industry, and in particular engineers who shape industrial developments, have a special role to play as they are confronted with the responsibility to holistically reflect sustainability in industrial processes. This means that, in addition to the technical specifications, engineers must also question the effects of their own actions on an ecological, economic and social level in order to ensure sustainable action and contribute to the achievement of the SDGs. However, this requires competencies that enable engineers to apply all three pillars of sustainability to their own field of activity and to understand the global impact of industrial processes. In this context, it is relevant to understand how industry already reflects sustainability and to identify competences needed for sustainable development.
In addition to the technical content, modern university courses should also teach professional skills to enhance the competencies of students towards their future work. This competency-driven approach, including technical as well as professional skills, makes it necessary to find a suitable way of integrating them into the corresponding module in a scalable and flexible manner. Agile development, for example, is essential for the development of modern systems and applications and makes use of dedicated professional skills of the team members, like structured group dynamics and communication, to enable fast and reliable development. This paper presents a flexible, easy-to-integrate approach for incorporating Scrum, an agile development method, into the lab of an existing module. Due to the different role models of Scrum, the students have an individual learning success, gain valuable insight into modern system development and strengthen their communication and organization skills. The approach is implemented and evaluated in the module Vehicle Systems, but it can be transferred easily to other technical courses as well. The evaluation of the implementation considers feedback from all stakeholders (students, supervisors and lecturers) and monitors the observations made during the project lifetime.
Throughout the last decade, and particularly in 2022, water scarcity has become a critical concern in Morocco and other Mediterranean countries. The lack of rainfall during spring was worsened by a succession of heat waves during the summer. To address this drought, innovative solutions, including the use of new technologies such as hydrogels, will be essential to transform agriculture. This paper presents the findings of a study that evaluated the impact of hydrogel application on onion (Allium cepa) cultivation in Meknes, Morocco. The treatments investigated in this study comprised two different types of hydrogel-based soil additives (Arbovit® polyacrylate and Huminsorb® polyacrylate), applied at two rates (30 and 20 kg/ha), and irrigated at two levels of water supply (100% and 50% of daily crop evapotranspiration; ETc). Two control treatments were included, without hydrogel application and with both water amounts. The experiment was conducted in an open field using a completely randomized design. The results indicated a significant impact of hydrogel type and dose as well as water dose on onion plant growth, as evidenced by various vegetation parameters. Among the hydrogels tested, Huminsorb® polyacrylate produced the most favorable outcomes, with treatment T9 (100% ETc, HP, 30 kg/ha) yielding 70.55 t/ha; this represented an increase of 11 t/ha as compared to the 100% ETc treatment without hydrogel application. Moreover, the combination of hydrogel application with 50% ETc water stress showed promising results, with treatment T4 (HP, 30 kg/ha, 50% ETc) producing almost the same yield as the 100% ETc treatment without hydrogel while saving 208 mm of water.
Antibias training is increasingly demanded and practiced in academia and industry to increase employees’ sensitivity to discrimination, racism, and diversity. Under the heading of “Diversity Management,” antibias trainings are mainly offered as one-off workshops intended to raise awareness of unconscious biases, create a diversity-affirming corporate culture, promote awareness of the potential of diversity, and ultimately enable the reflection of diversity in development processes. However, although the approach originates in early childhood education, research and scientific articles on the sustainable effectiveness of antibias training in adulthood, especially in academia, are very scarce. In order to fill this research gap, the article aims to explore how sustainable the effects of individual antibias trainings on participants’ behavior are. To investigate this, participant observation in a qualitative pre-post setting was conducted, analyzing antibias training in an academic context. Two observers actively participated in the training sessions and documented the activities and reflection processes of the participants. Overall, the results question the effectiveness of single antibias trainings and show that a target-group-adaptive approach is mandatory, given the approach’s origins in early childhood education. Therefore, antibias work needs to be adapted to the target group’s needs and realities of life. Furthermore, the study reveals that single antibias trainings must be embedded in a holistic diversity management approach to stimulate sustainable reflection processes among the target group. This article is one of the first to scientifically evaluate antibias training effectiveness, especially in the engineering sciences and the university context.
In times of social climate protection movements, such as Fridays for Future, the priorities of society, industry and higher education are currently changing, and the consideration of sustainability challenges is increasing. In the context of sustainable development, social skills are crucial to achieving the United Nations Sustainable Development Goals (SDGs). In particular, the impact that educational activities have on people, communities and society is therefore coming to the fore. Research has shown that people with high levels of social competence are better able to manage stressful situations, maintain positive relationships and communicate effectively. Such competencies are also associated with better academic performance and career success. However, especially in engineering programs, the social pillar is underrepresented compared to the environmental and economic pillars.
In response to these changes, higher education institutions should be more aware of their social impact - from individual forms of teaching to entire modules and degree programs. To specifically determine the potential for improvement and derive resulting change for further development, we present an initial framework for social impact measurement by transferring already established approaches from the business sector to the education sector. To demonstrate the applicability, we measure the key competencies taught in undergraduate engineering programs in Germany.
The aim is to prepare the students for success in the modern world of work and their future contribution to sustainable development. Additionally, the university can include the results in its sustainability report. Our method can be applied to different teaching methods and enables their comparison.
This book is based on a multimedia course for biological and chemical engineers, which is designed to trigger students' curiosity and initiative. A solid basic knowledge of thermodynamics and kinetics is necessary for understanding many technical, chemical, and biological processes.
The one-semester basic lecture course was divided into 12 workshops (chapters). Each chapter covers a practically relevant area of physical chemistry and contains the following didactic elements that make this book particularly exciting and understandable:
- Links to Videos at the start of each chapter as preparation for the workshop
- Key terms (in bold) for further research of your own
- Comprehension questions and calculation exercises with solutions as learning checks
- Key illustrations as simple, easy-to-replicate blackboard pictures
Humorous cartoons for each workshop (by Faelis) additionally lighten up the text and facilitate the learning process as a mnemonic. To round out the book, the appendix includes a summary of the most popular experiments in basic physical chemistry courses, as well as suggestions for designing workshops with exhibits, experiments, and "questions of the day."
Suitable for students minoring in chemistry; chemistry majors are sure to find this slimmed-down, didactically valuable book helpful as well. The book is excellent for self-study.
Existing residential buildings have an average lifetime of 100 years, and many of them will exist for at least another 50 years. To increase the efficiency of these buildings while keeping costs at reasonable rates, they can be retrofitted with sensors that deliver information to central control units for heating, ventilation and electricity. This retrofitting process should happen with minimal intervention into the existing infrastructure and requires new approaches for sensor design and data transmission. At FH Aachen University of Applied Sciences, students of different disciplines work together to learn how to design, build, deploy and operate such sensors. The presented teaching project has already produced a low-power design for a combined CO2, temperature and humidity measurement device that can be easily integrated into most home automation systems.
This paper presents an approach for reducing the cognitive load of humans working in quality control (QC) for production processes that adhere to the 6σ methodology. While 100% QC requires every part to be inspected, the workload can be reduced when a human-in-the-loop QC process is supported by an anomaly detection system that presents for manual inspection only those parts with a significant likelihood of being defective. This approach shows good results when applied to image-based QC for metal textile products.
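The screening idea in the abstract above can be sketched in a few lines: an anomaly detector scores every part, and only the suspicious ones are routed to a human inspector. The sketch below is a minimal, hypothetical illustration using a simple z-score detector on synthetic feature vectors; the actual system is image-based, and its features and thresholds are not specified in the abstract.

```python
import numpy as np

def select_for_inspection(features, reference, threshold=4.0):
    """Flag parts whose feature vector deviates strongly from the
    reference distribution of known-good parts (z-score based)."""
    mu = reference.mean(axis=0)
    sigma = reference.std(axis=0) + 1e-9
    scores = np.abs((features - mu) / sigma).max(axis=1)
    return np.where(scores > threshold)[0], scores

rng = np.random.default_rng(0)
# Known-good reference measurements (e.g. texture features of good parts)
good = rng.normal(0.0, 1.0, size=(500, 4))

# Production batch: 99 in-spec parts plus one clear outlier at index 17
batch = rng.normal(0.0, 1.0, size=(100, 4))
batch[17] = [8.0, -7.5, 9.0, 6.0]

flagged, scores = select_for_inspection(batch, good)
print(flagged)  # only the flagged parts go to manual inspection
```

The human inspector then sees only the flagged subset instead of all 100 parts, which is the cognitive-load reduction the paper targets.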
Assistance systems have been widely adopted in the manufacturing sector to facilitate various processes and tasks in production environments. However, existing systems are mostly equipped with rigid functional logic and do not provide individual user experiences or adapt to their capabilities. This work integrates human factors in assistance systems by adjusting the hardware and instruction presented to the workers’ cognitive and physical demands. A modular system architecture is designed accordingly, which allows a flexible component exchange according to the user and the work task. Gamification, the use of game elements in non-gaming contexts, has been further adopted in this work to provide level-based instructions and personalised feedback. The developed framework is validated by applying it to a manual workstation for industrial assembly routines.
This work proposes a hybrid algorithm combining an Artificial Neural Network (ANN) with a conventional local path planner to navigate UAVs efficiently in various unknown urban environments. The proposed method, a Hybrid Artificial Neural Network Avoidance System, is called HANNAS. The ANN analyses a video stream and classifies the current environment. This classification is used to set several control parameters of a conventional local path planner, the 3DVFH*. The local path planner then plans the path toward a specific goal point based on distance data from a depth camera. We trained and tested a state-of-the-art image segmentation algorithm, PP-LiteSeg. The proposed HANNAS method reaches a failure probability of 17%, which is less than half the failure probability of the baseline and around half the failure probability of an improved, bio-inspired version of the 3DVFH*. The proposed HANNAS method does not show any disadvantages regarding flight time or flight distance.
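The HANNAS coupling described above, where a classifier's environment label selects parameters for the local planner, can be illustrated with a minimal sketch. All class names and parameter values below are hypothetical placeholders, not the 3DVFH* configuration used in the study.

```python
# Hypothetical parameter presets keyed by environment class; the real
# HANNAS parameter sets for the 3DVFH* are not given in the abstract.
PLANNER_PRESETS = {
    "open":   {"goal_weight": 1.0, "obstacle_weight": 2.0, "horizon_m": 15.0},
    "street": {"goal_weight": 0.8, "obstacle_weight": 4.0, "horizon_m": 10.0},
    "dense":  {"goal_weight": 0.5, "obstacle_weight": 8.0, "horizon_m": 5.0},
}

def configure_planner(class_probs):
    """Pick the preset for the most likely environment class
    (class_probs would come from the segmentation network)."""
    label = max(class_probs, key=class_probs.get)
    return label, PLANNER_PRESETS[label]

# Example: the ANN is most confident the UAV is in a dense environment
label, params = configure_planner({"open": 0.1, "street": 0.2, "dense": 0.7})
print(label, params)
```

The planner itself keeps running unchanged; only its tuning switches with the perceived environment, which is the essence of the hybrid design.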
This work presents the Multi-Bees-Tracker (MBT3D) algorithm, a Python framework implementing a deep association tracker for tracking-by-detection, to address the challenging task of tracking the flight paths of bumblebees in a social group. Existing tracking algorithms for bumblebees often come with severe restrictions, such as the need for sufficient lighting, high contrast between the animal and the background, absence of occlusion, or significant user input. Tracking the flight paths of bumblebees in a social group is challenging: they adjust their movements abruptly, change their appearance across different wing-beat states, and exhibit strong similarities in their individual appearance. The MBT3D tracker, developed in this research, is an adaptation of an existing ant tracking algorithm for bumblebee tracking. It incorporates an offline-trained appearance descriptor along with a Kalman filter for appearance and motion matching. Different detector architectures for upstream detections (You Only Look Once (YOLOv5), Faster Region Proposal Convolutional Neural Network (Faster R-CNN), and RetinaNet) are investigated in a comparative study to optimize performance. The detection models were trained on a dataset containing 11,359 labeled bumblebee images. YOLOv5 reaches an average precision of AP = 53.8%, Faster R-CNN achieves AP = 45.3% and RetinaNet AP = 38.4% on the bumblebee validation dataset, which consists of 1,323 labeled bumblebee images. The tracker's appearance model is trained on 144 samples. The tracker (with Faster R-CNN detections) reaches a Multiple Object Tracking Accuracy of MOTA = 93.5% and a Multiple Object Tracking Precision of MOTP = 75.6% on a validation dataset containing 2,000 images, competing with state-of-the-art computer vision methods. The framework allows reliable tracking of different bumblebees in the same video stream with rarely occurring identity switches (IDS).
MBT3D has much lower IDS than other commonly used algorithms and one of the lowest false positive rates, competing with state-of-the-art animal tracking algorithms. The developed framework reconstructs the three-dimensional (3D) flight paths of the bumblebees by triangulation and can also handle and compare two alternative stereo camera pairs if desired.
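The association step of a tracking-by-detection pipeline like MBT3D, combining a motion cue with an appearance embedding and solving a global assignment, can be sketched as follows. This is a generic illustration, not the MBT3D implementation: the distance measure, the embeddings, and the weighting are assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(track_pos, track_app, det_pos, det_app, w=0.5):
    """Combine a motion cost (Euclidean distance between predicted track
    positions and detections) with an appearance cost (cosine distance
    between embedding vectors), then solve the assignment problem."""
    motion = np.linalg.norm(track_pos[:, None, :] - det_pos[None, :, :], axis=-1)
    motion /= motion.max() + 1e-9                      # normalize to [0, 1]
    ta = track_app / np.linalg.norm(track_app, axis=1, keepdims=True)
    da = det_app / np.linalg.norm(det_app, axis=1, keepdims=True)
    appearance = 1.0 - ta @ da.T                       # cosine distance
    cost = w * motion + (1 - w) * appearance
    rows, cols = linear_sum_assignment(cost)           # Hungarian method
    return list(zip(rows.tolist(), cols.tolist()))

# Two tracks, two detections: track 0 is near detection 1 and vice versa,
# and the appearance embeddings agree with that pairing
track_pos = np.array([[0.0, 0.0], [10.0, 10.0]])
det_pos   = np.array([[9.5, 10.2], [0.3, -0.1]])
track_app = np.array([[1.0, 0.0], [0.0, 1.0]])
det_app   = np.array([[0.1, 0.9], [0.9, 0.1]])
matches = associate(track_pos, track_app, det_pos, det_app)
print(matches)
```

In a real tracker the predicted positions would come from the Kalman filter and the embeddings from the offline-trained appearance descriptor; unmatched detections would spawn new tracks.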
Muscle function is compromised by gravitational unloading in space, affecting overall musculoskeletal health. Astronauts perform daily exercise programmes to mitigate these effects, but knowing which muscles to target would optimise effectiveness. Accurate inflight assessment to inform exercise programmes is critical due to the lack of technologies suitable for spaceflight. Changes in mechanical properties indicate muscle health status and can be measured rapidly and non-invasively using novel technology. A hand-held MyotonPRO device enabled monitoring of muscle health for the first time in spaceflight (> 180 days). Greater or maintained stiffness indicated that countermeasures were effective. Tissue stiffness was preserved in the majority of muscles (neck, shoulder, back, thigh), but tibialis anterior (foot lever muscle) stiffness decreased inflight vs. preflight (p < 0.0001; mean difference 149 N/m) in all 12 crewmembers. The calf muscles showed opposing effects, the gastrocnemius increasing in stiffness and the soleus decreasing. Selective stiffness decrements indicate a lack of preservation despite daily inflight countermeasures. This calls for more targeted exercises for the lower leg muscles, which play vital roles as ankle joint stabilizers and in gait. Muscle stiffness is a digital biomarker for risk monitoring during future planetary explorations (Moon, Mars) and for healthcare management in challenging environments or clinical disorders in people on Earth, enabling effective tailored exercise programmes.
In this paper, the use of reinforcement learning (RL) in control systems is investigated using a rotary inverted pendulum as an example. The control behavior of an RL controller is compared to that of traditional LQR and MPC controllers by evaluating their behavior under optimal conditions, their disturbance behavior, their robustness and their development process. All the investigated controllers are developed using MATLAB and the Simulink simulation environment and later deployed to a real pendulum model powered by a Raspberry Pi. The RL algorithm used is Proximal Policy Optimization (PPO). The LQR controller exhibits an easy development process, average to good control behavior and average to good robustness. A linear MPC controller showed excellent results under optimal operating conditions; however, when subjected to disturbances or deviations from the equilibrium point, it showed poor performance and sometimes unstable behavior. Employing a nonlinear MPC controller in real time was not possible due to the high computational effort involved. The RL controller exhibits by far the most versatile and robust control behavior. When operated in the simulation environment, it achieved high control accuracy. When employed on the real system, however, it shows only average accuracy and a significantly greater performance loss relative to the simulation than the traditional controllers. With MATLAB, it is not yet possible to post-train the RL controller directly on the Raspberry Pi, which is an obstacle to the practical application of RL in a prototyping or teaching setting. Nevertheless, RL in general proves to be a flexible and powerful control method, well suited for complex or nonlinear systems where traditional controllers struggle.
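As a point of reference for the LQR baseline mentioned above, the gain can be obtained by solving the continuous-time algebraic Riccati equation. The sketch below uses an illustrative two-state unstable plant, not the actual rotary pendulum model from the study (which was developed in MATLAB/Simulink).

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Linearized inverted-pendulum-style state-space model around the upright
# equilibrium (illustrative numbers, not the study's pendulum parameters)
A = np.array([[0.0, 1.0], [10.0, 0.0]])   # open loop is unstable
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                   # state weighting
R = np.array([[1.0]])                      # input weighting

# LQR: solve the Riccati equation A'P + PA - PBR^{-1}B'P + Q = 0,
# then K = R^{-1} B' P gives the state-feedback law u = -Kx
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# The closed-loop eigenvalues of A - BK must all lie in the left half-plane
eig = np.linalg.eigvals(A - B @ K)
print(K, eig.real)
```

For a controllable pair (A, B) with positive-definite weights, the resulting closed loop is guaranteed stable, which is why LQR is such a convenient baseline against RL and MPC.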
Drought and water shortage are serious problems in many arid and semi-arid regions. Due to climate change, the problem is getting worse and now extends even to temperate climatic regions. To address this problem, biodegradable hydrogels are becoming increasingly important as water-retaining soil additives. Furthermore, tailored hydrogels can provide an efficient supply of (micro-)nutrients. Biodegradable polyaspartic acid (PASP) hydrogels were synthesized with different available crosslinkers (1,6-hexamethylene diamine (HMD) and L-lysine (LYS)) and with newly developed crosslinkers based on diesters of glycine (GLY) and (di-)ethylene glycol (DEG and EG, respectively). They were characterized using Fourier transform infrared (FTIR) spectroscopy and scanning electron microscopy (SEM) as well as with regard to their swelling properties (kinetics, absorbency under load (AUL)) and the biodegradability of the PASP hydrogel. Copper(II) and zinc(II) ions were loaded as micronutrients in two different approaches: in situ during crosslinking and by subsequent loading of the prepared hydrogels. The results showed the successful synthesis of di-glycine-ester-based crosslinkers. Hydrogels with good water-absorbing properties were formed. Moreover, the developed crosslinking agents, in combination with the specific reaction conditions, resulted in higher water absorbency with increased crosslinker content used in synthesis (10% vs. 20%). The prepared hydrogels are candidates for water-storing soil additives due to the biodegradability of PASP, which is demonstrated in an example. The incorporation of Cu(II) and Zn(II) ions can provide these micronutrients for plant growth.
To gain insight on chemical sterilization processes, the influence of temperature (up to 70 °C), intense green light, and hydrogen peroxide (H₂O₂) concentration (up to 30% in aqueous solution) on microbial spore inactivation is evaluated by in-situ Raman spectroscopy with an optical trap. Bacillus atrophaeus is utilized as a model organism. Individual spores are isolated and their chemical makeup is monitored under dynamically changing conditions (temperature, light, and H₂O₂ concentration) to mimic industrially relevant process parameters for sterilization in the field of aseptic food processing. While isolated spores in water are highly stable, even at elevated temperatures of 70 °C, exposure to H₂O₂ leads to a loss of spore integrity characterized by the release of the key spore biomarker dipicolinic acid (DPA) in a concentration-dependent manner, which indicates damage to the inner membrane of the spore. Intensive light or heat, both of which accelerate the decomposition of H₂O₂ into reactive oxygen species (ROS), drastically shorten the spore lifetime, suggesting the formation of ROS as a rate-limiting step during sterilization. It is concluded that Raman spectroscopy can deliver mechanistic insight into the mode of action of H₂O₂-based sterilization and reveal the individual contributions of different sterilization methods acting in tandem.
Many important properties of bacterial cellulose (BC), such as moisture absorption capacity, elasticity and tensile strength, largely depend on its structure. This paper presents a study on the effect of the drying method on BC films produced by Medusomyces gisevii using two different procedures: room-temperature drying (RT; 24 ± 2 °C, humidity 65 ± 1%, dried until a constant weight was reached) and freeze-drying (FD; treated at −75 °C for 48 h). BC was synthesized using one of two different carbon sources, either glucose or sucrose. Structural differences in the obtained BC films were evaluated using atomic force microscopy (AFM), scanning electron microscopy (SEM), and X-ray diffraction. Macroscopically, the RT samples appeared semi-transparent and smooth, whereas the FD group exhibited an opaque white color and a sponge-like structure. SEM examination showed denser packing of fibrils in the FD samples, while the RT samples displayed a smaller average fiber diameter, lower surface roughness and less porosity. AFM confirmed the SEM observations and showed that the FD material exhibited a more branched structure and a higher surface roughness. The samples cultivated in a glucose-containing nutrient medium generally displayed a straight and ordered fibril shape compared to the sucrose-derived BC, which was characterized by a rougher and wavier structure. The BC films dried under different conditions showed distinctly different degrees of crystallinity, whereas the carbon source in the culture medium was found to have a relatively small effect on the BC crystallinity.
Frequency mixing magnetic detection (FMMD) is a sensitive and selective technique for detecting magnetic nanoparticles (MNPs) serving as probes for binding biological targets. Its principle relies on the nonlinear magnetic relaxation dynamics of a particle ensemble interacting with a dual-frequency external magnetic field. In order to increase its sensitivity, lower its limit of detection and improve its overall applicability in biosensing, matching combinations of external field parameters and internal particle properties are being sought to advance FMMD. In this study, we systematically probe this interaction with coupled Néel–Brownian dynamic relaxation simulations to examine how key MNP properties as well as applied field parameters affect the frequency mixing signal generation. It is found that the core size of MNPs dominates their nonlinear magnetic response, with the strongest contributions from the largest particles. The drive field amplitude dominates the shape of the field-dependent response, whereas the effective anisotropy and hydrodynamic size of the particles only weakly influence the signal generation in FMMD. For tailoring the MNP properties and setup parameters towards optimal FMMD signal generation, our findings suggest choosing large particles with core sizes dc > 25 nm and narrow size distributions (σ < 0.1) to minimize the required drive field amplitude. This allows potential improvements of FMMD as a stand-alone application, as well as advances in magnetic particle imaging, hyperthermia and magnetic immunoassays.
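The origin of the frequency mixing signal can be illustrated with a deliberately simplified model: a static Langevin magnetization (neglecting the Néel–Brownian dynamics simulated in the study) driven by a two-tone field produces intermodulation components at f1 ± 2f2. All amplitudes and frequencies below are illustrative, not the study's parameters.

```python
import numpy as np

def langevin(x):
    # Langevin function L(x) = coth(x) - 1/x: equilibrium magnetization
    # of an ensemble of superparamagnetic nanoparticles
    x = np.where(np.abs(x) < 1e-8, 1e-8, x)
    return 1.0 / np.tanh(x) - 1.0 / x

fs, n = 100_000, 100_000           # 1 s of signal, 1 Hz bin spacing
t = np.arange(n) / fs
f1, f2 = 10_000.0, 100.0           # weak high-frequency probe, strong drive
# two-tone excitation field, amplitudes in units of k_B T / (mu_0 m)
H = 0.2 * np.sin(2 * np.pi * f1 * t) + 3.0 * np.sin(2 * np.pi * f2 * t)
M = langevin(H)                    # static response, no relaxation dynamics

spectrum = np.abs(np.fft.rfft(M)) / n
freqs = np.fft.rfftfreq(n, 1 / fs)
mix = spectrum[np.argmin(np.abs(freqs - (f1 + 2 * f2)))]
print(mix)                         # mixing component at f1 + 2 f2
```

Because the Langevin function is odd, only odd-order mixing products appear (f1 ± 2f2, f1 ± 4f2, ...), which is exactly the component FMMD detects; the even-order product at f1 + f2 stays at the numerical noise floor.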
The deformation and damage laws of non-homogeneous irregular structural planes in rocks are the basis for studying the stability of rock engineering. To investigate the damage characteristics of rock containing non-parallel fissures, uniaxial compression tests and numerical simulations were conducted in this study on sandstone specimens containing three non-parallel fissures inclined at 0°, 45° and 90°. The characteristics of crack initiation and crack evolution of fissures with different inclinations were analyzed, and a constitutive model for the discontinuous fracturing of fissured sandstone was proposed. The results show that the fracture behavior of fissured sandstone specimens is discontinuous. The stress–strain curves are non-smooth and can be divided into a nonlinear crack closure stage, a linear elastic stage, a plastic stage and a brittle failure stage, of which the plastic stage contains discontinuous stress drops. During the uniaxial compression test, the middle or ends of the 0° fissures were the first to crack, before the 45° and 90° fissures. Between the 0° and 45° fissures, the ends separated by a small distance cracked first, while the ends separated by a large distance cracked later. After the final failure, the 0° fissures in all specimens were fractured, while the 45° and 90° fissures were not necessarily fractured. Numerical simulation results show that the concentration of compressive stress at the tips of the 0°, 45° and 90° fissures, as well as the concentration of tensile stress on both sides, decreased with increasing inclination angle. A constitutive model for the discontinuous fracturing of fissured sandstone specimens was derived by combining the logistic model with damage mechanics theory. This model describes the discontinuous stress drops well and agrees well with the complete stress–strain curves of the fissured sandstone specimens.
Analyzing electroencephalographic (EEG) time series can be challenging, especially with deep neural networks, due to the large variability among human subjects and often small datasets. To address these challenges, various strategies, such as self-supervised learning, have been suggested, but they typically rely on extensive empirical datasets. Inspired by recent advances in computer vision, we propose a pretraining task termed "frequency pretraining" to pretrain a neural network for sleep staging by predicting the frequency content of randomly generated synthetic time series. Our experiments demonstrate that our method surpasses fully supervised learning in scenarios with limited data and few subjects, and matches its performance in regimes with many subjects. Furthermore, our results underline the relevance of frequency information for sleep stage scoring, while also demonstrating that deep neural networks utilize information beyond frequencies to enhance sleep staging performance, which is consistent with previous research. We anticipate that our approach will be advantageous across a broad spectrum of applications where EEG data is limited or derived from a small number of subjects, including the domain of brain-computer interfaces.
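The pretext task described above, predicting the frequency content of randomly generated synthetic time series, can be sketched as a data generator producing signals together with multi-hot frequency labels. The bin layout, band limits, and noise level below are assumptions for illustration, not the paper's exact configuration.

```python
import numpy as np

def make_pretraining_batch(n_samples, n_bins=8, fs=100.0, length=300, rng=None):
    """Generate random synthetic signals as sums of sinusoids plus noise,
    with multi-hot labels marking which frequency bins are present --
    the pretext task is to predict these labels from the raw signal."""
    rng = rng if rng is not None else np.random.default_rng()
    bin_freqs = np.linspace(1.0, 30.0, n_bins)       # EEG-like band, Hz
    t = np.arange(length) / fs
    X = np.zeros((n_samples, length))
    y = np.zeros((n_samples, n_bins), dtype=int)
    for i in range(n_samples):
        active = rng.random(n_bins) < 0.3            # random subset of bins
        y[i] = active
        for f in bin_freqs[active]:
            X[i] += rng.uniform(0.5, 2.0) * np.sin(
                2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
        X[i] += rng.normal(0, 0.1, length)           # measurement noise
    return X, y

X, y = make_pretraining_batch(16, rng=np.random.default_rng(42))
print(X.shape, y.shape)
```

A network pretrained to predict `y` from `X` learns frequency-sensitive features without any real EEG, and its encoder can then be fine-tuned for sleep staging on the limited labeled data.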
The artificial olfactory image was proposed by Lundström et al. in 1991 as a new strategy for an electronic nose system which generated a two-dimensional mapping to be interpreted as a fingerprint of the detected gas species. The potential distribution generated by the catalytic metals integrated into a semiconductor field-effect structure was read as a photocurrent signal generated by scanning light pulses. The impact of the proposed technology spread beyond gas sensing, inspiring the development of various imaging modalities based on the light addressing of field-effect structures to obtain spatial maps of pH distribution, ions, molecules, and impedance, and these modalities have been applied in both biological and non-biological systems. These light-addressing technologies have been further developed to realize the position control of a faradaic current on the electrode surface for localized electrochemical reactions and amperometric measurements, as well as the actuation of liquids in microfluidic devices.
Aircraft configurations with propellers have been drawing more attention in recent times, partly due to new propulsion concepts based on hydrogen fuel cells and electric motors. These configurations are prone to whirl flutter, which is an aeroelastic instability affecting airframes with elastically supported propellers. It commonly needs to be mitigated already during the design phase of such configurations, requiring, among other things, unsteady aerodynamic transfer functions for the propeller. However, no comprehensive assessment of unsteady propeller aerodynamics for aeroelastic analysis is available in the literature. This paper provides a detailed comparison of nine different low- to mid-fidelity aerodynamic methods, demonstrating their impact on linear, unsteady aerodynamics, as well as whirl flutter stability prediction. Quasi-steady and unsteady methods for blade lift with or without coupling to blade element momentum theory are evaluated and compared to mid-fidelity potential flow solvers (UPM and DUST) and classical, derivative-based methods. Time-domain identification of frequency-domain transfer functions for the unsteady propeller hub loads is used to compare the different methods. Predictions of the minimum required pylon stiffness for stability show good agreement among the mid-fidelity methods. The differences in the stability predictions for the low-fidelity methods are higher. Most methods studied yield a more unstable system than classical, derivative-based whirl flutter analysis, indicating that the use of more sophisticated aerodynamic modeling techniques might be required for accurate whirl flutter prediction.
Magnetic nanoparticles (MNP) are investigated with great interest for biomedical applications in diagnostics (e.g. imaging: magnetic particle imaging (MPI)), therapeutics (e.g. hyperthermia: magnetic fluid hyperthermia (MFH)) and multi-purpose biosensing (e.g. magnetic immunoassays (MIA)). What all of these applications have in common is that they are based on the unique magnetic relaxation mechanisms of MNP in an alternating magnetic field (AMF). While MFH and MPI are currently the most prominent examples of biomedical applications, here we present results on the relatively new biosensing application of frequency mixing magnetic detection (FMMD) from a simulation perspective. In general, we ask how the key parameters of MNP (core size and magnetic anisotropy) affect the FMMD signal: by varying the core size, we investigate the effect of the magnetic volume per MNP; and by changing the effective magnetic anisotropy, we study the MNPs’ flexibility to leave its preferred magnetization direction. From this, we predict the most effective combination of MNP core size and magnetic anisotropy for maximum signal generation.
Pulmonary arterial cannulation is a common and effective method for percutaneous mechanical circulatory support in concurrent right heart and respiratory failure [1]. However, limited data exist on the effect that the positioning of the cannula has on oxygen perfusion throughout the pulmonary artery (PA). This study aims to evaluate, using computational fluid dynamics (CFD), the effect of different cannula positions in the PA on the oxygenation of the different branching vessels, in order to determine an optimal cannula position. The four chosen cannula positions (see Fig. 1) are: in the lower part of the main pulmonary artery (MPA); in the MPA at the junction between the right pulmonary artery (RPA) and the left pulmonary artery (LPA); in the RPA at its first branch; and in the LPA at its first branch.
Humic substances possess distinctive chemical features enabling their use in many advanced applications, including biomedical fields. No chemicals in nature have the same combination of specific chemical and biological properties as humic substances. Traditional medicine and modern research have demonstrated that humic substances from different sources possess immunomodulatory and anti-inflammatory properties, which makes them suitable for the prevention and treatment of chronic dermatoses, allergic rhinitis, atopic dermatitis, and other conditions characterized by inflammatory and allergic responses [1-4]. The use of humic compounds as agents with antifungal and antiviral properties shows great potential [5-7].
This study presents the concept of AstroBioLab, an autonomous astrobiological field laboratory tailored for the exploration of (sub)glacial habitats. AstroBioLab is an integral component of the TRIPLE (Technologies for Rapid Ice Penetration and subglacial Lake Exploration) DLR-funded project, aimed at advancing astrobiology research through the development and deployment of innovative technologies. AstroBioLab integrates diverse measurement techniques such as fluorescence microscopy, DNA sequencing and fluorescence spectrometry, while leveraging microfluidics for efficient sample delivery and preparation.
This easy-to-understand introduction to SAP S/4HANA guides you through the central processes in sales, purchasing and procurement, finance, production, and warehouse management using the model company Global Bike. Familiarize yourself with the basics of business administration, the relevant organizational data, master data, and transactional data, as well as a selection of core business processes in SAP. Using practical examples and tutorials, you will soon become an SAP S/4HANA professional!
Tutorials and exercises for beginners, advanced users, and experts make it easy for you to practice your new knowledge. The prerequisite for this book is access to an SAP S/4HANA client with Global Bike version 4.1.
- Business fundamentals and processes in the SAP system
- Sales, purchasing and procurement, production, finance, and warehouse management
- Tutorials at different qualification levels, exercises, and recap of case studies
- Includes extensive download material for students, lecturers, and professors
A novel method is presented to determine the extruded length of a metallic wire for a directed energy deposition (DED) process using a microwave (MW) plasma jet with a straight-through wire feed. The method is based on the relative comparison of the measured frequency response obtained by the large-signal scattering parameter (Hot-S) technique. In the practical working range, a repeatability of less than 6% for a nonactive plasma and 9% for the active plasma state is found. Measurements are conducted with a focus on a simple solution that decreases the processing time and eases the integration of the process into the existing hardware. It is shown that monitoring a single frequency for magnitude and phase changes is sufficient to achieve good accuracy, and a combination of different measurement values can be used to determine the length. The applicability to different diameters of the same material is shown, as well as contact detection between the wire and the metallic substrate.
This article addresses the need for an innovative plasma shaping technique, utilizing antenna structures, Maxwell's laws, and boundary conditions within a shielded environment. The motivation lies in exploring a novel approach to efficiently generate high-energy-density plasma with potential applications across various fields. Implemented in an E01 circular cavity resonator, the proposed method involves an impedance and field matching device with a coaxial connector and a specially optimized monopole antenna. This setup feeds a low-loss cavity resonator, resulting in a high-energy-density air plasma with a surface temperature exceeding 3500 °C, achieved with a minimal power input of 80 W. The argon plasma, resembling the shape of a simple monopole antenna with modeled complex dielectric values, offers a more energy-efficient alternative to traditional, power-intensive plasma shaping methods. Simulations using a commercial electromagnetic (EM) solver validate the design's effectiveness, while experimental validation underscores the method's feasibility and practical implementation. Analyzing various parameters in an argon atmosphere, including hot S-parameters and plasma beam images, the results demonstrate the successful application of this technique, suggesting its potential in coating, furnace technology, fusion, and spectroscopy applications.