Refine
Year of publication
- 2023 (92)
Institute
- Fachbereich Medizintechnik und Technomathematik (22)
- Fachbereich Elektrotechnik und Informationstechnik (20)
- ECSM European Center for Sustainable Mobility (16)
- Fachbereich Luft- und Raumfahrttechnik (16)
- Fachbereich Chemie und Biotechnologie (10)
- Fachbereich Energietechnik (10)
- Fachbereich Wirtschaftswissenschaften (7)
- IfB - Institut für Bioengineering (7)
- Fachbereich Maschinenbau und Mechatronik (6)
- MASKOR Institut für Mobile Autonome Systeme und Kognitive Robotik (6)
Has Fulltext
- no (92)
Language
- English (92)
Document Type
- Article (43)
- Conference Proceeding (35)
- Part of a Book (6)
- Habilitation (2)
- Preprint (2)
- Book (1)
- Conference: Meeting Abstract (1)
- Contribution to a Periodical (1)
- Talk (1)
Keywords
- Natural language processing (3)
- Associated liquids (2)
- Diversity Management (2)
- Engineering Habitus (2)
- Future Skills (2)
- Information extraction (2)
- Interdisciplinarity (2)
- Organizational Culture (2)
- Power plants (2)
- Sustainability (2)
The growing body of political texts opens up new opportunities for rich insights into political dynamics and ideologies but also increases the workload for manual analysis. Automated speaker attribution, which detects who said what to whom in a speech event and is closely related to semantic role labeling, is an important processing step for computational text analysis. We study the potential of the large language model family Llama 2 to automate speaker attribution in German parliamentary debates from 2017 to 2021. We fine-tune Llama 2 with QLoRA, an efficient training strategy, and observe that our approach achieves competitive performance in the GermEval 2023 Shared Task on Speaker Attribution in German News Articles and Parliamentary Debates. Our results shed light on the capabilities of large language models in automating speaker attribution, revealing a promising avenue for computational analysis of political discourse and the development of semantic role labeling systems.
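The QLoRA recipe named above can be sketched as follows, assuming the Hugging Face `transformers`, `peft` and `bitsandbytes` libraries; the model name and every hyperparameter here are illustrative placeholders, not the values used in the paper.

```python
# Sketch of a QLoRA-style fine-tuning setup: a 4-bit quantized base model
# plus trainable low-rank adapters. All values are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                       # 4-bit base weights (the "Q" in QLoRA)
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf", quantization_config=bnb_config
)

lora_config = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,  # low-rank adapter hyperparameters
    target_modules=["q_proj", "v_proj"],     # attention projections to adapt
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)   # only adapter weights are trained
model.print_trainable_parameters()
```

Because only the small adapter matrices receive gradients, this configuration fits fine-tuning of a 7B-parameter model on a single consumer GPU.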
By developing innovative solutions to social and environmental problems, sustainable ventures carry great potential. Entrepreneurship, which focuses especially on new venture creation, can be developed through education, and universities in particular are called upon to provide an impetus for social change. But social innovations are associated with certain hurdles related to their multi-dimensionality, i.e. the tension between creating social, environmental and economic value and dealing with a multiplicity of stakeholders. The already complex field of entrepreneurship education has to face these challenges. This paper therefore aims to identify starting points for the integration of sustainability into entrepreneurship education. To pursue this goal, experiences from three different project initiatives between the partner universities Lapland University of Applied Sciences, FH Aachen University of Applied Sciences and Turiba University are reflected upon, and the findings are systematically condensed into recommendations for education on sustainable entrepreneurship.
This thesis aims at the presentation and discussion of well-accepted and new
imaging techniques applied to different types of flow in common hydraulic
engineering environments. All studies are conducted in laboratory conditions and
focus on flow depth and velocity measurements. Investigated flows cover a wide
range of complexity, e.g. propagation of waves, dam-break flows, slightly and fully
aerated spillway flows as well as highly turbulent hydraulic jumps.
New imaging methods are compared to different types of sensors which are frequently
employed in contemporary laboratory studies. This classical instrumentation, as well
as the general concept of hydraulic modeling, is introduced to give an overview of
experimental methods.
Flow depths are commonly measured by means of ultrasonic sensors, also known as
acoustic displacement sensors. These sensors may provide accurate data with high
sample rates in case of simple flow conditions, e.g. low-turbulent clear water flows.
However, with increasing turbulence, higher uncertainty must be considered.
Moreover, ultrasonic sensors can provide point data only, while the relatively large
acoustic beam footprint may lead to another source of uncertainty in case of
relatively short, highly turbulent surface fluctuations (ripples) or free-surface
air-water flows. Analysis of turbulent length and time scales of surface fluctuations
from point measurements is also difficult. Imaging techniques with different
dimensionality, however, may close this gap. It is shown in this thesis that edge
detection methods (known from computer vision) may be used for two-dimensional
free-surface extraction (i.e. from images taken through transparent sidewalls in
laboratory flumes). Another opportunity in hydraulic laboratory studies comes with
the application of stereo vision. Low-cost RGB-D sensors can be used to gather
instantaneous, three-dimensional free-surface elevations, even in flows with very
high complexity (e.g. aerated hydraulic jumps). It will be shown that the uncertainty
of these methods is of similar order as for classical instruments.
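The edge-detection idea for two-dimensional free-surface extraction can be illustrated with a minimal gradient-based sketch; this NumPy toy stands in for the computer-vision methods discussed and assumes a simple brightness contrast between air and water, whereas a real setup would need filtering and a pixel-to-metre calibration.

```python
# Minimal sketch of free-surface extraction from a sidewall image: per image
# column, locate the row with the strongest vertical intensity change and
# interpret it as the air-water interface. Purely illustrative.
import numpy as np

def extract_surface(image: np.ndarray) -> np.ndarray:
    """Return, per column, the row index just above the strongest
    vertical intensity gradient (the detected interface)."""
    grad = np.abs(np.diff(image.astype(float), axis=0))  # vertical gradient
    return grad.argmax(axis=0)                            # one row per column

# Synthetic test image: 40 rows x 5 columns, brightness step at row 25,
# so the detected edge sits at row index 24 (last row before the step).
img = np.zeros((40, 5))
img[25:, :] = 1.0
surface = extract_surface(img)
```

Applied frame by frame to a video, the resulting surface profiles give the time-resolved, two-dimensional flow depth information that point sensors cannot provide.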
Particle Image Velocimetry (PIV) is a well-accepted and widespread imaging
technique for velocity determination in laboratory conditions. In combination with
high-speed cameras, PIV can give time-resolved velocity fields in 2D/3D or even as
volumetric flow fields. PIV is based on a cross-correlation technique applied to small
subimages of seeded flows. The minimum size of these subimages defines the
maximum spatial resolution of resulting velocity fields. A derivative of PIV for
aerated flows is also available, i.e. the so-called Bubble Image Velocimetry (BIV). This
thesis emphasizes the capacities and limitations of both methods, using relatively
simple setups with halogen and LED illuminations. It will be demonstrated that
PIV/BIV images may also be processed by means of Optical Flow (OF) techniques.
OF is another method originating from the computer vision discipline, based on the
assumption of image brightness conservation within a sequence of images. The
Horn-Schunck approach, which is applied to hydraulic engineering problems for the
first time in the studies presented herein, yields dense velocity fields, i.e. pixelwise
velocity data. As discussed hereinafter, the accuracy of OF competes well with PIV
for clear-water flows and even improves results (compared to BIV) for aerated flow
conditions. In order to independently benchmark the OF approach, synthetic images
with defined turbulence intensity are used.
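The cross-correlation step at the core of PIV, mentioned above, can be sketched in a few lines: the peak of the correlation map between two interrogation windows gives the mean particle displacement in pixels. This NumPy sketch uses FFT-based circular correlation and omits the subpixel refinement a real PIV code would add.

```python
# Sketch of PIV-style displacement estimation between two interrogation
# windows (subimages) via FFT cross-correlation. Illustrative only.
import numpy as np

def displacement(win_a: np.ndarray, win_b: np.ndarray):
    """Estimate the integer-pixel shift of win_b relative to win_a."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Circular cross-correlation via the frequency domain.
    corr = np.fft.irfft2(np.fft.rfft2(a).conj() * np.fft.rfft2(b), s=a.shape)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map the (wrapped) peak location to signed shifts.
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, a.shape)]
    return tuple(shifts)

# Synthetic check: shift a random particle pattern by a known amount.
rng = np.random.default_rng(0)
win_a = rng.random((32, 32))
win_b = np.roll(win_a, shift=(3, -2), axis=(0, 1))  # 3 px down, 2 px left
```

The window size bounds the spatial resolution of the velocity field, which is exactly the limitation that dense, pixelwise optical-flow methods such as Horn-Schunck relax.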
Computer vision offers new opportunities that may help to improve the
understanding of fluid mechanics and fluid-structure interactions in laboratory
investigations. In prototype environments, it can be employed for obstacle detection
(e.g. identification of potential fish migration corridors) and recognition (e.g. fish
species for monitoring in a fishway) or surface reconstruction (e.g. inspection of
hydraulic structures). It can thus be expected that applications to hydraulic
engineering problems will develop rapidly in the near future. Current methods have not
been developed for fluids in motion. Systematic future developments are needed to
improve the results in such difficult conditions.
The major advantage of labyrinth weirs over linear weirs is hydraulic efficiency. In hydraulic modeling efforts, this strength contrasts with limited pump capacity as well as limited computational power for CFD simulations. For the latter, reducing the number of investigated cycles can significantly reduce necessary computational time. In this study, a labyrinth weir with different cycle numbers was investigated. The simulations were conducted in FLOW-3D HYDRO as a Large Eddy Simulation. With a mean deviation of 1.75 % between simulated discharge coefficients and literature design equations, a reasonable agreement was found. For downstream conditions, overall consistent results were observed as well. However, the orientation of labyrinth weirs with a single cycle should be chosen carefully under consideration of the individual research purpose.
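The discharge coefficients compared above can be back-calculated from a simulated discharge via the standard head-discharge relation; the sketch below assumes the common labyrinth-weir formulation Q = (2/3)·Cd·√(2g)·Lc·Ht^1.5 with Lc the crest length and Ht the total upstream head, and all numeric values are illustrative rather than taken from the study.

```python
# Back-calculating a discharge coefficient Cd from a simulated discharge,
# assuming the common labyrinth-weir head-discharge relation
# Q = (2/3) * Cd * sqrt(2g) * Lc * Ht**1.5. Values are illustrative.
import math

def discharge_coefficient(Q, Lc, Ht, g=9.81):
    """Cd from discharge Q [m^3/s], crest length Lc [m], total head Ht [m]."""
    return Q / ((2.0 / 3.0) * math.sqrt(2.0 * g) * Lc * Ht ** 1.5)

Cd = discharge_coefficient(Q=0.25, Lc=5.0, Ht=0.10)
```

Comparing Cd values computed this way against literature design equations over a range of heads yields the kind of mean deviation (here 1.75 %) reported in the study.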
Today’s society is undergoing a paradigm shift driven by the megatrend of sustainability. This undeniably affects all areas of Western life. This paper aims to find out how the luxury industry is dealing with this change and what adjustments are made by the companies. For this purpose, interviews were conducted with managers from the luxury industry, in which they were asked about specific measures taken by their companies as well as trends in the industry. In a subsequent evaluation, the trends in the luxury industry were summarized for the areas of ecological, social, and economic sustainability. It was found that the area of environmental sustainability is significantly more focused than the other sub-areas. Furthermore, the need for a customer survey to validate the industry-based measures was identified.
Messenger apps like WhatsApp and Telegram are frequently used for everyday communication, but they can also be utilized as a platform for illegal activity. Telegram allows public groups with up to 200,000 participants. Criminals use these public groups for trading illegal commodities and services, which becomes a concern for law enforcement agencies, who manually monitor suspicious activity in these chat rooms. This research demonstrates how natural language processing (NLP) can assist in analyzing these chat rooms, providing an explorative overview of the domain and facilitating purposeful analyses of user behavior. We provide a publicly available corpus of annotated text messages with entities and relations from four self-proclaimed black market chat rooms. Our pipeline approach aggregates the extracted product attributes from user messages to profiles and uses these with their sold products as features for clustering. The extracted structured information is the foundation for further data exploration, such as identifying the top vendors or fine-granular price analyses. Our evaluation shows that pretrained word vectors perform better for unsupervised clustering than state-of-the-art transformer models, while the latter remain superior for sequence labeling.
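The profile-aggregation step can be illustrated with a stdlib-only toy: product mentions extracted per user are averaged into a profile vector and vendors are compared by cosine similarity. The made-up 3-dimensional "word vectors", user names, and products below are placeholders for the pretrained embeddings and clustering the paper actually uses.

```python
# Toy sketch of vendor profiling: average per-user "word vectors" of
# extracted product mentions, then compare profiles by cosine similarity.
# Vectors, names and products are invented for illustration.
import math

WORD_VECS = {            # stand-ins for pretrained word vectors
    "cannabis": (0.9, 0.1, 0.0),
    "hash":     (0.8, 0.2, 0.1),
    "account":  (0.0, 0.9, 0.3),
    "paypal":   (0.1, 0.8, 0.4),
}

def profile(products):
    """Average the word vectors of a user's extracted product mentions."""
    dims = zip(*(WORD_VECS[p] for p in products))
    return tuple(sum(d) / len(products) for d in dims)

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

vendors = {"userA": ["cannabis", "hash"], "userB": ["account", "paypal"],
           "userC": ["hash"]}
profiles = {u: profile(ps) for u, ps in vendors.items()}
sim_ac = cosine(profiles["userA"], profiles["userC"])  # similar vendors
sim_ab = cosine(profiles["userA"], profiles["userB"])  # dissimilar vendors
```

Feeding such profile vectors into a standard clustering algorithm (e.g. k-means) then groups vendors that trade in similar goods, which is the basis for the top-vendor and price analyses described.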
Deammonification for nitrogen removal in municipal wastewater in temperate and cold climate zones is currently limited to the side stream of municipal wastewater treatment plants (MWWTP). This study developed a conceptual model of a mainstream deammonification plant, designed for 30,000 P.E., considering possible solutions corresponding to the challenging mainstream conditions in Germany. In addition, the energy-saving potential, nitrogen elimination performance and construction-related costs of mainstream deammonification were compared to a conventional plant model, having a single-stage activated sludge process with upstream denitrification. The results revealed that an additional treatment step combining chemical precipitation and ultra-fine screening is advantageous prior to mainstream deammonification. Hereby, the chemical oxygen demand (COD) can be reduced by 80%, so that the COD:N ratio drops from 12 to 2.5. Laboratory experiments testing mainstream conditions of temperature (8–20°C), pH (6–9) and COD:N ratio (1–6) showed an achievable volumetric nitrogen removal rate (VNRR) of at least 50 gN/(m3∙d) for various deammonifying sludges from side stream deammonification systems in the state of North Rhine-Westphalia, Germany, where m3 denotes reactor volume. Assuming a retained Norganic content of 0.0035 kgNorg./(P.E.∙d) from the daily loads of N at the carbon removal stage and a VNRR of 50 gN/(m3∙d) under mainstream conditions, a resident-specific reactor volume of 0.115 m3/(P.E.) is required for mainstream deammonification. This is in the same order of magnitude as the conventional activated sludge process, i.e., 0.173 m3/(P.E.) for an MWWTP of size class 4. The conventional plant model yielded a total specific electricity demand of 35 kWh/(P.E.∙a) for the operation of the whole MWWTP and an energy recovery potential of 15.8 kWh/(P.E.∙a) through anaerobic digestion.
In contrast, the developed mainstream deammonification model plant would require only a 21.5 kWh/(P.E.∙a) energy demand and result in 24 kWh/(P.E.∙a) energy recovery potential, enabling the mainstream deammonification model plant to be self-sufficient. The retrofitting costs for the implementation of mainstream deammonification in existing conventional MWWTPs are nearly negligible as the existing units like activated sludge reactors, aerators and monitoring technology are reusable. However, the mainstream deammonification must meet the performance requirement of VNRR of about 50 gN/(m3∙d) in this case.
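The stated resident-specific reactor volume follows from dividing the specific nitrogen load by the VNRR; as a quick consistency check (not a calculation from the study itself), the stated figures imply the following removable nitrogen load per resident and day.

```python
# Consistency check of the stated sizing: reactor volume = N load / VNRR,
# so the implied resident-specific N load is VNRR * specific volume.
vnrr = 50.0                            # achievable removal rate, gN/(m3*d)
specific_volume = 0.115                # stated reactor volume, m3/(P.E.)
implied_load = vnrr * specific_volume  # gN/(P.E.*d) routed to deammonification
```

The implied load of 5.75 gN/(P.E.∙d) is plausible for the nitrogen remaining after the carbon removal stage, which supports the stated comparability with the 0.173 m3/(P.E.) of a conventional activated sludge process.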
This paper describes the potential for developing a digital twin of society: a dynamic model that can be used to observe, analyze, and predict the evolution of various societal aspects. Such a digital twin can help governmental agencies and policy makers in interpreting trends, understanding challenges, and making decisions regarding investments or policies necessary to support societal development and ensure future prosperity. The paper reviews related work regarding the digital twin paradigm and its applications. The paper presents a motivating case study: an analysis of opportunities and challenges faced by the German federal employment agency, Bundesagentur für Arbeit (BA), proposes solutions using digital twins, and describes initial proofs of concept for such solutions.
Ice melting probes
(2023)
The exploration of icy environments in the solar system, such as the poles of Mars and the icy moons (a.k.a. ocean worlds), is a key aspect for understanding their astrobiological potential as well as for extraterrestrial resource inspection. On these worlds, ice melting probes are considered to be well suited for the robotic clean execution of such missions. In this chapter, we describe ice melting probes and their applications, the physics of ice melting and how the melting behavior can be modeled and simulated numerically, the challenges for ice melting, and the required key technologies to deal with those challenges. We also give an overview of existing ice melting probes and report some results and lessons learned from laboratory and field tests.
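A common starting point for the melting models mentioned above is a first-order energy balance: all heater power goes into warming the ice to the melting point and melting it, which directly yields a descent speed. The sketch below neglects losses (lateral conduction, refreezing of the melt channel), and all numeric inputs are illustrative, not mission values.

```python
# Idealized energy balance for an ice melting probe:
# v = P / (A * rho_ice * (c_p * dT + L_f)). Losses neglected; illustrative.
import math

RHO_ICE = 917.0      # kg/m^3, density of ice
CP_ICE = 2100.0      # J/(kg*K), specific heat of ice
L_FUSION = 334e3     # J/kg, latent heat of fusion

def melt_velocity(power_w, diameter_m, t_ice_c):
    """Descent velocity [m/s] of a cylindrical probe of given heater power."""
    area = math.pi * (diameter_m / 2.0) ** 2          # melt cross-section
    enthalpy = CP_ICE * (0.0 - t_ice_c) + L_FUSION    # J per kg of ice melted
    return power_w / (area * RHO_ICE * enthalpy)

# 1 kW probe, 12 cm diameter, in ice at -20 degC.
v = melt_velocity(power_w=1000.0, diameter_m=0.12, t_ice_c=-20.0)
```

Estimates of this kind (here on the order of a metre per hour) set the scale for mission planning; in very cold, deep ice the neglected conduction losses dominate, which is why the chapter's numerical melting models go well beyond this balance.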
Experimental determination of the cross sections of proton capture on radioactive nuclei is extremely difficult, although such data are of substantial interest for the understanding of the production of the p-nuclei. For the first time, a direct measurement of proton-capture cross sections on stored, radioactive ions became possible in an energy range of interest for nuclear astrophysics. The experiment was performed at the Experimental Storage Ring (ESR) at GSI by making use of a sensitive method to measure (p,γ) and (p,n) reactions in inverse kinematics. These reaction channels are of high relevance for the nucleosynthesis processes in supernovae, which are among the most violent explosions in the universe and are not yet well understood. The cross section of the ¹¹⁸Te(p,γ) reaction has been measured at energies of 6 MeV/u and 7 MeV/u. The heavy ions interacted with a hydrogen gas jet target. The radiative recombination process of the fully stripped ¹¹⁸Te ions and electrons from the hydrogen target was used as a luminosity monitor. An overview of the experimental method and preliminary results from the ongoing analysis will be presented.
Meitner-Auger-electron emitters have a promising potential for targeted radionuclide therapy of cancer because of their short range and the high linear energy transfer of Meitner-Auger-electrons (MAE). One promising MAE candidate is 197m/gHg with its half-life of 23.8 h and 64.1 h, respectively, and high MAE yield. Gold nanoparticles (AuNPs) that are labelled with 197m/gHg could be a helpful tool for radiation treatment of glioblastoma multiforme when infused into the surgical cavity after resection to prevent recurrence. To produce such AuNPs, 197m/gHg was embedded into pristine AuNPs. Two different syntheses were tested starting from irradiated gold containing trace amounts of 197m/gHg. When sodium citrate was used as reducing agent, no 197m/gHg labelled AuNPs were formed, but with tannic acid, 197m/gHg labeled AuNPs were produced. The method was optimized by neutralizing the pH (pH = 7) of the Au/197m/gHg solution, which led to labelled AuNPs with a size of 12.3 ± 2.0 nm as measured by transmission electron microscopy. The labelled AuNPs had a concentration of 50 μg (gold)/mL with an activity of 151 ± 93 kBq/mL (197gHg, time corrected to the end of bombardment).
The popularity of social media and particularly Instagram grows steadily. People use the different platforms to share pictures as well as videos and to communicate with friends. The potential of social media platforms is also being used for marketing purposes and for selling products. While for Facebook and other online social media platforms the purchase decision factors have been investigated several times, Instagram stores have remained mainly unattended so far. The present research work closes this gap and sheds light on decisive factors for purchasing products offered in Instagram stores. A theoretical research model, which contains selected constructs that are assumed to have a significant influence on Instagram users' purchase intention, is developed. The hypotheses are evaluated by applying structural equation modelling to survey data containing 127 relevant participants. The results of the study reveal that ‘trust’, ‘personal recommendation’, and ‘usability’ significantly influence users’ buying intention in Instagram stores.
Software development projects often fail because of insufficient code quality. It is now well documented that the task of testing software, for example, is perceived as uninteresting and rather boring, leading to poor software quality and major challenges for software development companies. One promising approach to increase the motivation for considering software quality is the use of gamification. Initial research works have already investigated the effects of gamification on software developers and have come to promising results. Nevertheless, a lack of results from field experiments exists, which motivates the chapter at hand. By conducting a gamification experiment with five student software projects and by interviewing the project members, the chapter provides insights into the changing programming behavior of information systems students when confronted with a leaderboard. The results reveal a motivational effect as well as a reduction of code smells.
Autonomous agents require rich environment models for fulfilling their missions. High-definition (HD) maps are a well-established map format that allows for representing semantic information besides the usual geometric information of the environment, for instance road shapes, road markings, traffic signs or barriers. The geometric resolution of HD maps can be as precise as centimetre level. In this paper, we report on our approach of using HD maps as a map representation for autonomous load-haul-dump vehicles in open-pit mining operations. As the mine undergoes constant change, we also need to constantly update the map. Therefore, we follow a lifelong mapping approach for updating the HD maps based on camera-based object detection and GPS data. We present our mapping algorithm based on the Lanelet 2 map format and show our integration with the navigation stack of the Robot Operating System. We present experimental results on our lifelong mapping approach from a real open-pit mine.
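The lifelong-mapping loop can be sketched schematically: camera detections tagged with GPS positions are merged into a semantic map layer, replacing stale elements within a matching radius. The classes and the threshold below are hypothetical illustrations; the actual system operates on Lanelet 2 map primitives inside ROS.

```python
# Schematic lifelong map update: associate a new geo-referenced detection
# with existing map elements and keep the newest observation.
# Classes and the 2 m association radius are hypothetical.
import math
from dataclasses import dataclass, field

@dataclass
class MapElement:
    kind: str            # e.g. "barrier", "traffic_sign"
    x: float             # map-frame position from GPS
    y: float
    stamp: float         # observation time

@dataclass
class SemanticMap:
    elements: list = field(default_factory=list)
    match_radius: float = 2.0    # metres; association threshold (assumed)

    def update(self, det: MapElement) -> None:
        """Replace a nearby element of the same kind, or insert a new one."""
        for i, el in enumerate(self.elements):
            near = math.hypot(el.x - det.x, el.y - det.y) < self.match_radius
            if el.kind == det.kind and near:
                if det.stamp > el.stamp:
                    self.elements[i] = det   # newer observation wins
                return
        self.elements.append(det)

m = SemanticMap()
m.update(MapElement("barrier", 10.0, 5.0, stamp=1.0))
m.update(MapElement("barrier", 10.5, 5.2, stamp=2.0))  # same barrier, moved
```

Running this association continuously as the vehicle operates keeps the semantic layer consistent with a mine that is constantly being reshaped.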
Throughout the last decade, and particularly in 2022, water scarcity has become a critical concern in Morocco and other Mediterranean countries. The lack of rainfall during spring was worsened by a succession of heat waves during the summer. To address this drought, innovative solutions, including the use of new technologies such as hydrogels, will be essential to transform agriculture. This paper presents the findings of a study that evaluated the impact of hydrogel application on onion (Allium cepa) cultivation in Meknes, Morocco. The treatments investigated in this study comprised two different types of hydrogel-based soil additives (Arbovit® polyacrylate and Huminsorb® polyacrylate), applied at two rates (30 and 20 kg/ha), and irrigated at two levels of water supply (100% and 50% of daily crop evapotranspiration; ETc). Two control treatments were included, without hydrogel application and with both water amounts. The experiment was conducted in an open field using a completely randomized design. The results indicated a significant impact of both hydrogel-type dose and water dose on onion plant growth, as evidenced by various vegetation parameters. Among the hydrogels tested, Huminsorb® Polyacrylate produced the most favorable outcomes, with treatment T9 (100%, HP, 30 kg/ha) yielding 70.55 t/ha; this represented an increase of 11 t/ha as compared to the 100% ETc treatment without hydrogel application. Moreover, the combination of hydrogel application with 50% ETc water stress showed promising results, with treatment T4 (HP, 30 kg, 50%) producing almost the same yield as the 100% ETc treatment without hydrogel while saving 208 mm of water.
Proteins are important ingredients in food and feed, they are the active components of many pharmaceutical products, and they are necessary, in the form of enzymes, for the success of many technical processes. However, production can be challenging, especially when using heterologous host cells such as bacteria to express and assemble recombinant mammalian proteins. The manufacturability of proteins can be hindered by low solubility, a tendency to aggregate, or inefficient purification. Tools such as in silico protein engineering and models that predict separation criteria can overcome these issues but usually require the complex shape and surface properties of proteins to be represented by a small number of quantitative numeric values known as descriptors, as similarly used to capture the features of small molecules. Here, we review the current status of protein descriptors, especially for application in quantitative structure activity relationship (QSAR) models. First, we describe the complexity of proteins and the properties that descriptors must accommodate. Then we introduce descriptors of shape and surface properties that quantify the global and local features of proteins. Finally, we highlight the current limitations of protein descriptors and propose strategies for the derivation of novel protein descriptors that are more informative.
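A minimal example of a global shape descriptor of the kind reviewed is the radius of gyration, which collapses a full 3-D structure into a single number. The coordinates below are synthetic; a real pipeline would parse atom positions from a PDB file and could weight atoms by mass.

```python
# Radius of gyration as a toy global shape descriptor:
# root-mean-square distance of (here unweighted) points from their centroid.
import numpy as np

def radius_of_gyration(coords: np.ndarray) -> float:
    """RMS distance of an (n, 3) coordinate array from its centroid."""
    centered = coords - coords.mean(axis=0)
    return float(np.sqrt((centered ** 2).sum(axis=1).mean()))

# Synthetic "structure": eight corners of a unit cube, each sqrt(3)/2
# from the centre, so the descriptor evaluates to sqrt(3)/2.
cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                dtype=float)
rg = radius_of_gyration(cube)
```

QSAR models take vectors of many such descriptors (shape, charge distribution, surface hydrophobicity) as input features; the review's point is that proteins need far richer descriptor sets than the small molecules this approach was designed for.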
Environmental emissions, global warming, and energy-related concerns have accelerated the advancements in conventional vehicles that primarily use internal combustion engines. Among the existing technologies, hydrogen fuel cell electric vehicles and fuel cell hybrid electric vehicles may have minimal contributions to greenhouse gas emissions and thus are the prime choices for environmental concerns. However, energy management in fuel cell electric vehicles and fuel cell hybrid electric vehicles is a major challenge. Appropriate control strategies should be used for effective energy management in these vehicles. On the other hand, there has been significant progress in artificial intelligence, machine learning, and designing data-driven intelligent controllers. These techniques have found much attention within the community, and state-of-the-art energy management technologies have been developed based on them. This manuscript reviews the application of machine learning and intelligent controllers for prediction, control, energy management, and vehicle to everything (V2X) in hydrogen fuel cell vehicles. The effectiveness of data-driven control and optimization systems are investigated to evolve, classify, and compare, and future trends and directions for sustainability are discussed.
AI-based systems are nearing ubiquity not only in everyday low-stakes activities but also in medical procedures. To protect patients and physicians alike, explainability requirements have been proposed for the operation of AI-based decision support systems (AI-DSS), which adds hurdles to the productive use of AI in clinical contexts. This raises two questions: Who decides these requirements? And how should access to AI-DSS be provided to communities that reject these standards (particularly when such communities are expert-scarce)? This chapter investigates a dilemma that emerges from the implementation of global AI governance. While rejecting global AI governance limits the ability to help communities in need, global AI governance risks undermining and subjecting health-insecure communities to the force of the neo-colonial world order. For this, this chapter first surveys the current landscape of AI governance and introduces the approach of relational egalitarianism as key to (global health) justice. To discuss the two horns of the referred dilemma, the core power imbalances faced by health-insecure collectives (HICs) are examined. The chapter argues that only strong demands of a dual strategy towards health-secure collectives can both remedy the immediate needs of HICs and enable them to become healthcare independent.
Extracting workflow nets from textual descriptions can be used to simplify guidelines or formalize textual descriptions of formal processes like business processes and algorithms. The task of manually extracting processes, however, requires domain expertise and effort. While automatic process model extraction is desirable, annotating texts with formalized process models is expensive. Therefore, there are only a few machine-learning-based extraction approaches. Rule-based approaches, in turn, require domain specificity to work well and can rarely distinguish relevant and irrelevant information in textual descriptions. In this paper, we present GUIDO, a hybrid approach to the process model extraction task that first classifies sentences regarding their relevance to the process model, using a BERT-based sentence classifier, and second extracts a process model from the sentences classified as relevant, using dependency parsing. The presented approach achieves significantly better results than a pure rule-based approach. GUIDO achieves an average behavioral similarity score of 0.93. Still, in comparison to purely machine-learning-based approaches, the annotation costs stay low.
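The two-stage design can be sketched with stdlib stand-ins: stage 1 filters sentences for relevance (a keyword heuristic here, where GUIDO uses a BERT-based classifier) and stage 2 extracts subject-verb-object step triples (a naive pattern here, where GUIDO uses dependency parsing). The cue words, pattern, and example sentences are all invented for illustration.

```python
# Schematic of a hybrid extract-relevant-then-parse pipeline. Both stages
# are deliberately naive stand-ins for the learned components.
import re

RELEVANCE_CUES = {"then", "first", "next", "afterwards", "finally"}  # assumed

def is_relevant(sentence: str) -> bool:
    """Stage 1: keep sentences that look like process steps."""
    words = set(re.findall(r"[a-z]+", sentence.lower()))
    return bool(words & RELEVANCE_CUES)

def extract_step(sentence: str):
    """Stage 2: naive subject-verb-object extraction from 'the X <verb>s the Y'."""
    m = re.search(r"the (\w+) (\w+?s) the (\w+)", sentence.lower())
    return m.groups() if m else None

text = ["The clerk checks the form.",          # no cue word: filtered out here
        "Our office opened in 1990.",          # irrelevant background
        "Then the manager signs the form."]
steps = [extract_step(s) for s in text if is_relevant(s)]
```

The extracted (actor, action, object) triples are the raw material from which transitions of a workflow net are assembled; the relevance filter is what lets the approach discard background sentences that purely rule-based extraction tends to mistranslate into spurious process steps.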