Sustainable technologies that conserve resources and generate energy are becoming increasingly important in urban spaces. This bachelor's thesis deals with the development of a corporate design for a company specialising in the planning and manufacture of bioenergy façades. Through its focus on sustainable energy generation and efficient building planning, the company contributes to addressing ecological challenges. The aim of the new corporate design is to communicate the complex subject of bioenergy façades effectively to the target audience and to spark their interest in this system. Illustrations and infographics are used to present the technology in an understandable way, to clearly highlight the positive environmental effects and advantages of bioenergy façades, and to encourage more installations.
Due to the increasing complexity of software projects, software development is becoming more and more dependent on teams. The quality of this teamwork can vary depending on the team composition, as teams are always a combination of different skills and personality types. This paper aims to answer the question of how to describe a software development team and what influence the personality of the team members has on team dynamics. For this purpose, a systematic literature review (n=48) and a literature search with the AI research assistant Elicit (n=20) were conducted. Result: a person's personality significantly shapes their thinking and actions, which in turn influences their behavior in software development teams. It has been shown that team performance and satisfaction can be strongly influenced by personality; the quality of communication and the likelihood of conflict can also be attributed to it.
This paper presents an approach for reducing the cognitive load of humans working in quality control (QC) for production processes that adhere to the 6σ methodology. While 100% QC requires every part to be inspected, this burden can be reduced when a human-in-the-loop QC process is supported by an anomaly detection system that presents for manual inspection only those parts with a significant likelihood of being defective. The approach shows good results when applied to image-based QC for metal textile products.
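As a hedged illustration of the human-in-the-loop idea, the gating step can be sketched in a few lines; the function name, threshold and scores below are hypothetical, and the real system scores parts with an image-based anomaly detector rather than a lookup table:

```python
# Hypothetical sketch of the gating step: only parts whose anomaly score
# exceeds a threshold are queued for manual inspection.
def select_for_manual_inspection(part_ids, score, threshold=0.8):
    return [p for p in part_ids if score(p) >= threshold]

# Toy scores standing in for the output of an image-based anomaly detector.
scores = {"part-1": 0.05, "part-2": 0.92, "part-3": 0.40, "part-4": 0.85}
flagged = select_for_manual_inspection(scores, scores.get)
# Only part-2 and part-4 reach the human inspector; the rest pass automatically.
```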
Experimental determination of the cross sections of proton capture on radioactive nuclei is extremely difficult, yet such data are of substantial interest for understanding the production of the p-nuclei. For the first time, a direct measurement of proton-capture cross sections on stored radioactive ions became possible in an energy range of interest for nuclear astrophysics. The experiment was performed at the Experimental Storage Ring (ESR) at GSI, making use of a sensitive method to measure (p,γ) and (p,n) reactions in inverse kinematics. These reaction channels are of high relevance for the nucleosynthesis processes in supernovae, which are among the most violent explosions in the universe and are not yet well understood. The cross section of the ¹¹⁸Te(p,γ) reaction has been measured at energies of 6 MeV/u and 7 MeV/u. The heavy ions interacted with a hydrogen gas-jet target. The radiative recombination of the fully stripped ¹¹⁸Te ions with electrons from the hydrogen target was used as a luminosity monitor. An overview of the experimental method and preliminary results from the ongoing analysis are presented.
Digital forensics of smartphones is of utmost importance in many criminal cases. As modern smartphones store chats, photos, videos etc. that can be relevant for investigations, and as they can have storage capacities of hundreds of gigabytes, they are a primary target for forensic investigators. However, it is exactly this large amount of data that causes problems: extracting and examining the data from multiple phones seized in the context of a case takes more and more time. This bears the risk of wasting a lot of time on irrelevant phones while not enough time is left to analyze a phone that is worth examining. Forensic triage can help in this case: such a triage is a preselection step based on a subset of data and is performed before fully extracting all the data from the smartphone. Triage can accelerate subsequent investigations and is especially useful in cases where time is essential. The aim of this paper is to determine which and how much data from an Android smartphone can be made directly accessible to the forensic investigator – without tedious investigations. For this purpose, an app has been developed that stores only extremely little data on the handset itself and outputs the extracted data immediately to the forensic workstation in a human- and machine-readable format.
Subtilisins from microbial sources, especially from the Bacillaceae family, are of particular interest for biotechnological applications and serve the currently growing enzyme market as efficient and novel biocatalysts. Biotechnological applications include use in detergents, cosmetics, leather processing, wastewater treatment and pharmaceuticals. To identify a possible candidate for the enzyme market, we cloned the gene of the subtilisin SPFA from Fictibacillus arsenicus DSM 15822ᵀ (obtained through a data-mining-based search) and expressed it in Bacillus subtilis DB104. After production and purification, the protease showed a molecular mass of 27.57 kDa and a pI of 5.8. SPFA displayed hydrolytic activity with a temperature optimum of 80 °C and a very broad pH optimum between 8.5 and 11.5, with high activity up to pH 12.5. SPFA displayed no NaCl dependence but a high NaCl tolerance, with activity decreasing only gradually up to concentrations of 5 m NaCl, while its stability increased with increasing NaCl concentration. Based on its substrate preference for 10 synthetic peptide 4-nitroanilide substrates with three or four amino acids and on its phylogenetic classification, SPFA can be assigned to the subgroup of true subtilisins. Moreover, SPFA exhibited high tolerance to 5% (w/v) SDS and 5% (v/v) H₂O₂. The biochemical properties of SPFA, especially its tolerance of remarkably high pH, SDS and H₂O₂, suggest it has potential for biotechnological applications.
The worldwide Corona pandemic severely restricted student projects in the higher semesters of engineering courses. In order not to delay graduation, a new concept had to be developed for projects under lockdown conditions. Unused rooms at the university were therefore to be recorded digitally in order to develop a new usage concept for them as laboratory rooms. An inventory of the actual state of the rooms was first taken by photographing them and listing all flaws and peculiarities. After that, a digital site survey was carried out with a 360° laser scanner; the recorded scans were linked into a coherent point cloud and transferred to software for planning technical building services and supporting Building Information Modelling (BIM). To better illustrate the difference between the actual and target state, two virtual reality models were created for realistic demonstration. During the project, the students had to go through all of the digital planning phases. Technical specifications had to be complied with, as did documentation, time planning and cost estimation. The project turned out to be an excellent alternative to on-site practical training under lockdown conditions and increased the students' motivation to deal with complex technical questions.
In the context of the Corona pandemic and its impact on teaching, such as digital lectures and exercises, a new concept became necessary, especially for freshmen in demanding courses such as Smart Building Engineering. As there were hardly any face-to-face events at the university, the new teaching concept was to enable a good start into engineering studies under pandemic conditions and also to replace the written exam at the end. The students were to become active themselves in small teams instead of passively listening to a lecture broadcast online with almost no personal contact. For this purpose, a role play was developed in which the freshmen had to work out a complete solution to the realistic problem of designing, planning and implementing a small guesthouse. Each student in the team had to take on a certain role, such as architect, site manager, BIM manager, electrician or technician for HVAC installations. Technical specifications had to be complied with, as did documentation, time planning and cost estimation. The final project folder had to contain technical documents such as circuit diagrams for electrical components, schematics for water and heating, design calculations and component lists. Additionally, a construction schedule, a construction implementation plan, documentation of the construction progress and minutes of meetings between the various trades had to be submitted. Besides the project folder, a model of the construction project also had to be created, either as a handmade model or as a digital 3D model using computer-aided design (CAD) software. First steps in the field of Building Information Modelling (BIM) were also taken by creating a digital model of the building showing the current planning status in real time as a digital twin.
The project turned out to be excellent training in important student competencies such as teamwork, communication skills and self-organisation, and it also increased motivation to work on complex technical questions. The aim of giving the students a first impression of the challenges and solutions in building projects involving many different technical trades and their points of view was very well achieved, and the concept should be continued in the future.
Baukeramik (1925)
July 2021. Heavy rainfall of up to 250 l/m² within 24 hours, brought by storm «Bernd», meant that the soil, already saturated by previous precipitation, could no longer absorb additional water. The result was severe flooding, above all in Rhineland-Palatinate and North Rhine-Westphalia. That the effects of climate change have already arrived on our doorstep is processed autobiographically and illustratively in this work, in the form of the graphic novel «Die Dinge danach – zum Hochwasser 21». For when the water raged at home, the author herself was not there: she spent the first days stranded in Euskirchen, a town that was itself under water and had become an island, without electricity or mobile coverage and with contact to the outside world severely hampered … and all she had originally planned to do was look after two tomcats for a week.
Electronic music surrounds us all. The influence of electronic music production on modern popular music is undeniable, and its production techniques are indispensable by today's standards. Over the years of its development and the genres that emerged along the way, however, the individual characteristics of the various subgenres often simply disappeared under the collective label of electronic music.
»Electrovisuals« presents the auditory characteristics of electronic music in the form of clear infographics, providing a better understanding of the rhythm and structure of the songs. The various subgenres are staged across multiple media and juxtaposed with one another to emphasise their similarities and differences.
Through this presentation, the project highlights the great diversity of electronic music and renders it striking and tangible.
The KERAMION houses a broad collection of historical ceramic pieces as well as modern artistic ceramic works. In the museum's new concept, a connection was created between the historical collection and modern art, visualised by a typeface designed especially for the museum. This typeface combines a sweeping historical Kurrent script from the 16th century with a modern, angular serif face. The result is a typeface that reflects the breadth of ceramics, covering every facet from earthenware to porcelain. The unique typeface is paired with a centre-oriented layout principle that puts 3D scans of the exhibits in focus. Visitors can thus get a foretaste of the unique materials and ceramic techniques even from home.
From drawing table and Letraset to content-aware fill and OpenType: how the tools of graphic design have evolved and shaped design processes. The bachelor's thesis „Toolbar: Werkzeuge des Grafikdesigns" engages with its own discipline, graphic design, and traces its roots in its tools. In conversations with various designers, tools and technologies of graphic design are examined and compared, from analogue paste-up to modern design methods. The thesis discusses how these tools have developed over time and what effects this has had on graphic design and on the positioning of designers. It also questions and illustrates the significance of tools in the creative process and their influence on design.
The Konzerthaus Berlin is one of Berlin's most beautiful venues for classical music, a house standing on the threshold between long tradition and modernity. The quality of the music played there is meant to be accessible to everyone, which a wide variety of concerts makes possible. To communicate this modern aspiration outwardly, a new design concept was developed for the Konzerthaus Berlin. At the centre of the new corporate design is the idea of rhythm, which appears in the headline principle and continues across media in the layout through contrasts of large and small. The visual identity works both animated and static. Music is thus made visually tangible and forms the modern focal point of the design.
The use of online retail has been growing steadily for years. For retailers, and especially for small businesses increasingly displaced by larger ones, it is more important than ever to bring their online presence up to date. Yet this often poses great challenges for smaller businesses, because digital retail requires constant updating and costs time, effort and money. „Localution" is a digital platform that enables local businesses to build a website with little effort. In addition, bookable services are offered that optimise brick-and-mortar retail. For consumers, „Localution" offers the possibility of finding local businesses nearby and booking appointments.
Selected problems in the field of multivariate statistical analysis are treated, with one focus on the paired-sample case. Among other things, statistical testing problems of marginal homogeneity are considered. In detail, properties of Hotelling's T² test in a special parametric situation are obtained. Moreover, the nonparametric problem of marginal homogeneity is discussed on the basis of possibly incomplete data. In the bivariate case, properties of the Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic are investigated on the basis of partly not identically distributed data. Similar testing problems are treated within the scope of applying a result for the empirical process of the concomitants to partly categorical data. Furthermore, testing changes in the modeled solvency capital requirement of an insurance company by means of a paired sample from an internal risk model is discussed. Beyond the paired-sample case, a new asymptotic relative efficiency concept based on the expected volumes of multidimensional confidence regions is introduced. Besides, a new approach for treating the multi-sample goodness-of-fit problem is presented. Finally, a consistent test for the goodness-of-fit problem is developed against the background of huge or infinite-dimensional data.
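For readers unfamiliar with the statistic, a minimal sketch of Hotelling's T² for a paired sample (the classical textbook form, not the thesis's specialised variants) might look like this; the data are made up for illustration:

```python
import numpy as np

def hotelling_t2_paired(x, y):
    """Classical one-sample Hotelling's T^2 applied to paired differences:
    T^2 = n * dbar' S^{-1} dbar, testing whether the mean difference is zero."""
    d = np.asarray(x) - np.asarray(y)      # paired differences, shape (n, p)
    n = d.shape[0]
    dbar = d.mean(axis=0)                  # mean difference vector
    s = np.cov(d, rowvar=False)            # sample covariance of the differences
    return float(n * dbar @ np.linalg.solve(s, dbar))

# Toy paired sample whose components differ only by small noise.
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))
y = x + rng.normal(scale=0.1, size=(50, 2))
t2 = hotelling_t2_paired(x, y)             # compare against an F-based critical value in practice
```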
The prefabricated steel family house ›Quelle-Fertighaus‹, designed and constructed by the German company Quelle, is an innovative modular system commercialised in 1962. All aspects of the Quelle-Fertighaus were planned on the principle of minimal effort for maximal flexibility. The clever design of the ground plan, based on a 4 m × 7 m module, offers the flexibility to add one to three further modules.
The steel construction is innovative and unique, consisting of load-bearing portal frames that act as braces. The house design is furthermore characterised by a simple metrical grid layout and the practical placement of the foundation and basement, which allowed very cost-effective production and the lowest price for a prefabricated family house in Germany during the postwar era. Nowadays, its portal-frame construction offers an interesting approach for renovation and transformation in line with present building demands.
The present work studied the mainstream feasibility of the deammonifying sludge from the side stream of the municipal wastewater treatment plant (MWWTP) in Kaster, Germany. For this purpose, the deammonifying sludge available in the side stream was investigated for nitrogen (N) removal with respect to the operational factors temperature (15–30°C), pH value (6.0–8.0) and chemical oxygen demand (COD)/N ratio (≤1.5–6.0). The highest and lowest N-removal rates of 0.13 and 0.045 kg/(m³·d) were achieved at 30 and 15°C, respectively. Different pH conditions and COD/N ratios in the SBRs of partial nitritation/anammox (PN/A) significantly influenced both the metabolic processes and the associated N-removal rates. The scientific insights gained from this work signify the possibility of mainstream PN/A at WWTPs and form a solid basis for the operational window of the upcoming semi-technical trials to be conducted prior to full-scale mainstream PN/A at WWTP Kaster and at WWTPs globally.
KNX is a protocol for smart building automation, e.g., for automated heating, air conditioning or lighting. This paper analyses and evaluates state-of-the-art KNX devices from the manufacturers Merten, Gira and Siemens with respect to security. On the one hand, it investigates whether publicly known vulnerabilities, such as insecure storage of passwords in software, unencrypted communication or denial-of-service attacks, can be reproduced in new devices. On the other hand, security is analysed in general, leading to the discovery of a previously unknown, high-risk vulnerability related to so-called BCU (authentication) keys.
Nowadays, the devices most commonly employed for recording videos or capturing images are undoubtedly smartphones. Our work investigates the application of source camera identification to mobile phones. We present a dataset collected entirely with mobile phones, containing both still images and videos from 67 different smartphones. Part of the images consists of photos of uniform backgrounds, collected especially for the computation of the RSPN. Identifying the source camera of a video is particularly challenging due to the strong video compression. The experiments reported in this paper show the large variation in performance when testing a highly accurate technique on still images versus videos.
Even the shortest flight through unknown, cluttered environments requires reliable local path planning algorithms to avoid unforeseen obstacles. The algorithm must evaluate alternative flight paths and identify the best path if an obstacle blocks its way. Commonly, weighted sums are used here. This work shows that weighted Chebyshev distances and factorial achievement scalarising functions are suitable alternatives to weighted sums when combined with the 3DVFH* local path planning algorithm. Both methods considerably reduce the failure probability of simulated flights in various environments. The standard 3DVFH* uses a weighted sum and has a failure probability of 50% in the test environments. A factorial achievement scalarising function that minimises the worst combination of two out of four objective functions reaches a failure probability of 26%; a weighted Chebyshev distance that optimises the worst objective has a failure probability of 30%. These results are promising for further enhancement and broader applicability.
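The difference between a weighted sum and a weighted Chebyshev distance can be shown in a few lines; the objective values below are hypothetical, and the actual 3DVFH* objectives are flight-path costs:

```python
# Two scalarisations of a multi-objective cost vector (values hypothetical).
def weighted_sum(costs, weights):
    # Allows trade-offs: one very bad objective can be offset by good ones.
    return sum(w * c for w, c in zip(weights, costs))

def weighted_chebyshev(costs, weights):
    # Scores a candidate by its worst weighted objective only.
    return max(w * c for w, c in zip(weights, costs))

path_a = [0.1, 0.1, 0.1, 0.8]   # three cheap objectives hide one near-failing one
path_b = [0.3, 0.3, 0.3, 0.3]   # balanced
w = [1.0, 1.0, 1.0, 1.0]

# The weighted sum prefers path_a (1.1 vs 1.2) despite its near-failing
# objective; the Chebyshev distance prefers the balanced path_b (0.3 vs 0.8).
```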
Ice melting probes (2023)
The exploration of icy environments in the solar system, such as the poles of Mars and the icy moons (a.k.a. ocean worlds), is a key aspect for understanding their astrobiological potential as well as for extraterrestrial resource inspection. On these worlds, ice melting probes are considered to be well suited for the robotic clean execution of such missions. In this chapter, we describe ice melting probes and their applications, the physics of ice melting and how the melting behavior can be modeled and simulated numerically, the challenges for ice melting, and the required key technologies to deal with those challenges. We also give an overview of existing ice melting probes and report some results and lessons learned from laboratory and field tests.
Lead and nickel, as heavy metals, are still used in industrial processes and are classified as "environmental health hazards" due to their toxicity and polluting potential. Detecting heavy metals can prevent environmental pollution at toxic levels critical to human health. In this sense, the electrolyte–insulator–semiconductor (EIS) field-effect sensor is an attractive platform for fabricating reusable and robust sensors to detect such substances. This study aimed to fabricate a sensing unit on an EIS device based on Sn₃O₄ nanobelts embedded in a polyelectrolyte matrix of polyvinylpyrrolidone (PVP) and polyacrylic acid (PAA) using the layer-by-layer (LbL) technique. The EIS-Sn₃O₄ sensor exhibited enhanced electrochemical performance for detecting Pb²⁺ and Ni²⁺ ions, revealing a higher affinity for Pb²⁺ ions, with sensitivities of ca. 25.8 mV/decade and 2.4 mV/decade, respectively. These results indicate that Sn₃O₄ nanobelts constitute a feasible proof-of-concept capacitive field-effect sensor for heavy metal detection, paving the way for future studies focused on environmental monitoring.
The increasing share of renewable electricity in the grid drives the need for sufficient storage capacity. Especially for seasonal storage, power-to-gas can be a promising approach: biologically produced methane, made from hydrogen generated with surplus electricity, can substitute natural gas in the existing infrastructure. Current reactor types are poorly optimized, if at all, for flexible methanation. This work therefore proposes a new reactor type with a plug flow reactor (PFR) design. Simulations in COMSOL Multiphysics® showed promising properties for operation in laminar flow. An experiment was conducted to support the simulation results and to determine the gas fraction of the novel reactor, which was measured to be 29%. Based on these simulation and experimental results, the reactor was constructed as a 14 m long tube of 50 mm diameter in a meandering orientation. Data processing was established and a step experiment was performed. In addition, a kLa of 1 h⁻¹ was determined. The results revealed that the experimental outcomes for the type of flow and the gas fractions are in line with the theoretical simulation. The new design shows promising properties for flexible methanation and will be tested further.
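The reported kLa of 1 h⁻¹ can be read through the standard first-order gas-liquid mass-transfer model (a textbook relation, not code or data from the paper itself):

```python
import math

def dissolved_fraction(t_hours, kla_per_hour):
    """Fraction of gas-liquid saturation reached after time t in the
    first-order mass-transfer model: C(t)/C* = 1 - exp(-kLa * t)."""
    return 1.0 - math.exp(-kla_per_hour * t_hours)

# With a kLa of 1 h^-1, roughly 63% of saturation is reached after one
# hour and about 95% after three hours.
after_one_hour = dissolved_fraction(1.0, 1.0)
after_three_hours = dissolved_fraction(3.0, 1.0)
```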
Automated driving is now possible in diverse road and traffic conditions. However, there are still situations that automated vehicles cannot handle safely and efficiently. In such cases, a Transition of Control (ToC) is necessary so that the driver takes over the driving task. Executing a ToC requires the driver to gain full situation awareness of the driving environment. If the driver fails to take back control within a limited time, a Minimum Risk Maneuver (MRM) is executed to bring the vehicle into a safe state (e.g., decelerating to a full stop). The execution of ToCs requires some time and can cause traffic disruption and safety risks, which increase if several vehicles execute ToCs/MRMs at similar times and in the same area. This study proposes using novel C-ITS traffic management measures in which the infrastructure exploits V2X communications to assist Connected and Automated Vehicles (CAVs) in the execution of ToCs. The infrastructure can suggest a spatial distribution of ToCs and inform vehicles of locations where they could execute a safe stop in case of an MRM. This paper reports the first field operational tests that validate the feasibility and quantify the benefits of the proposed infrastructure-assisted ToC and MRM management. The paper also presents the CAV and roadside infrastructure prototypes implemented and used in the trials. The conducted field trials demonstrate that infrastructure-assisted traffic management solutions can reduce safety risks and traffic disruptions.
Modern implementations of driver assistance systems are evolving from pure driver assistance into independently acting automation systems. Still, these systems do not cover the full vehicle usage range, also called the operational design domain, which requires the human driver as a fall-back mechanism. Transition of control and potential minimum-risk manoeuvres are current research topics and will bridge the gap until fully autonomous vehicles are available. The authors showed in a demonstration that transition-of-control mechanisms can be further improved by the use of communication technology. Receiving incident type and position information via standardised vehicle-to-everything (V2X) messages can improve driver safety and comfort. The connected and automated vehicle's software framework can use this information to plan areas where the driver should take back control by initiating a transition of control, which can be followed by a minimum-risk manoeuvre in case of an unresponsive driver. This transition of control was implemented in a test vehicle and presented to the public during IEEE IV2022 (IEEE Intelligent Vehicles Symposium) in Aachen, Germany.
Selecting the right business processes for automation with Robotic Process Automation (RPA) is decisive for the success of RPA projects. This chapter provides selection criteria derived from a qualitative study with eleven interviewed RPA experts from the insurance sector. The result comprises a weighted list of seven dimensions and 51 process criteria that favour automation with software robots, or whose absence hampers or even prevents implementation. The three most important criteria for selecting business processes for RPA automation are relieving the employees involved in the process (employee relief), the executability of the process by means of rules (rule-based process control), and a positive cost-benefit comparison. Building on these results, a comparison with the selection criteria already known from the literature is drawn and discussed. Practitioners can use the results to make a systematic selection of RPA-relevant processes. From a scientific perspective, the results provide a basis for explaining the success and failure of RPA projects.
A method is presented for detecting and approximating fault lines or surfaces (in two and three dimensions, respectively) or decision curves with guaranteed accuracy. Reformulated as a classification problem, our method starts from a set of scattered points, along with the corresponding classification algorithm, to construct a representation of a decision curve by points with a prescribed maximal distance to the true decision curve. Hereby, our algorithm ensures that the representing point set covers the decision curve in its entire extent and features local refinement based on the geometric properties of the decision curve. We demonstrate applications of our method to problems related to the detection of faults, to multi-criteria decision aid and, in combination with Kirsch's factorization method, to solving an inverse acoustic scattering problem. In all applications considered in this work, our method requires significantly fewer pointwise classifications than previously employed algorithms.
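A one-dimensional toy version of the guaranteed-accuracy idea is plain bisection between two points with different class labels; the paper's actual method additionally covers the whole curve or surface and refines it locally, and the classifier below is hypothetical:

```python
def boundary_point(classify, a, b, tol=1e-6):
    """Bisection: return a point within `tol` of the decision boundary on
    the segment [a, b], assuming classify(a) != classify(b)."""
    ca = classify(a)
    while abs(b - a) > tol:
        mid = 0.5 * (a + b)
        if classify(mid) == ca:
            a = mid                 # boundary lies in [mid, b]
        else:
            b = mid                 # boundary lies in [a, mid]
    return 0.5 * (a + b)

# Hypothetical classifier whose decision point sits at x = 0.3.
x0 = boundary_point(lambda x: x > 0.3, 0.0, 1.0)
```

Each pointwise classification halves the interval, so the prescribed accuracy is reached with a logarithmic number of classifier calls.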
Deammonification for nitrogen removal from municipal wastewater in temperate and cold climate zones is currently limited to the side stream of municipal wastewater treatment plants (MWWTP). This study developed a conceptual model of a mainstream deammonification plant, designed for 30,000 P.E., considering possible solutions for the challenging mainstream conditions in Germany. In addition, the energy-saving potential, nitrogen elimination performance and construction-related costs of mainstream deammonification were compared to a conventional plant model having a single-stage activated sludge process with upstream denitrification. The results revealed that an additional treatment step combining chemical precipitation and ultra-fine screening is advantageous prior to mainstream deammonification. Hereby, chemical oxygen demand (COD) can be reduced by 80%, so that the COD:N ratio drops from 12 to 2.5. Laboratory experiments testing mainstream conditions of temperature (8–20°C), pH (6–9) and COD:N ratio (1–6) showed an achievable volumetric nitrogen removal rate (VNRR) of at least 50 g N/(m³·d) for various deammonifying sludges from side-stream deammonification systems in the state of North Rhine-Westphalia, Germany, where m³ denotes reactor volume. Assuming a retained organic N content of 0.0035 kg Norg/(P.E.·d) from the daily N loads at the carbon removal stage and a VNRR of 50 g N/(m³·d) under mainstream conditions, a resident-specific reactor volume of 0.115 m³/(P.E.) is required for mainstream deammonification. This is in the same order of magnitude as the conventional activated sludge process, i.e., 0.173 m³/(P.E.) for an MWWTP of size class 4. The conventional plant model yielded a total specific electricity demand of 35 kWh/(P.E.·a) for the operation of the whole MWWTP and an energy recovery potential of 15.8 kWh/(P.E.·a) through anaerobic digestion.
In contrast, the developed mainstream deammonification model plant would require an energy demand of only 21.5 kWh/(P.E.·a) and offer an energy recovery potential of 24 kWh/(P.E.·a), enabling it to be energy self-sufficient. The retrofitting costs for implementing mainstream deammonification in existing conventional MWWTPs are nearly negligible, as existing units such as activated sludge reactors, aerators and monitoring technology can be reused. However, in this case the mainstream deammonification must meet the performance requirement of a VNRR of about 50 g N/(m³·d).
Industry 4.0 poses many challenges for manufacturing companies and their employees. Innovative and effective training strategies are required to keep pace with rapidly changing production environments and new manufacturing technologies. Virtual Reality (VR) offers new opportunities for on-the-job, on-demand and off-premise training. This thesis presents a new VR training system that flexibly adapts to different training objects on the basis of recipes and CAD models. The concept is based on directed acyclic graphs and a level system. It enables a user-specific learning pace by means of visual elements. The concept was implemented for a mechanical use case with industrial components and realized in the Industry 4.0 model factory of FH Aachen.
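The recipe-based sequencing described above can be modeled as a directed acyclic graph whose topological order yields a valid training sequence. A minimal sketch using Python's standard library; the step names are hypothetical and not taken from the thesis:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical assembly recipe: each training step maps to the steps
# that must be completed before it.
recipe = {
    "mount_base": set(),
    "insert_bearing": {"mount_base"},
    "attach_shaft": {"insert_bearing"},
    "fit_cover": {"mount_base"},
    "final_check": {"attach_shaft", "fit_cover"},
}

# A topological order of the DAG is one valid order to train the steps in.
order = list(TopologicalSorter(recipe).static_order())
print(order)
```

A level system on top of such a graph would simply gate how many not-yet-unlocked successors are shown to the trainee at once.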
Digital twins are seen as one of the key technologies of Industry 4.0. Although many research groups focus on digital twins and create meaningful outputs, the technology has not yet reached broad application in industry. The main reasons for this imbalance are the complexity of the topic, the lack of specialists, and a lack of awareness of the opportunities digital twins offer. The project "Digital Twin Academy" aims to overcome these barriers by focusing on three actions: building a digital twin community for discussion and exchange, offering multi-stage training for various knowledge levels, and implementing real-world use cases for deeper insights and guidance. In this work, we focus on creating a flexible learning platform that allows users to select a training path adjusted to their personal knowledge and needs. To this end, a mix of basic and advanced modules is created and expanded with individual feedback options. The use of personas supports the selection of appropriate modules.
The development of prototype applications with sensors and actuators in the automation industry requires tools that are manufacturer-independent and flexible enough to be modified or extended for specific requirements. Currently, developing prototypes with industrial sensors and actuators is not straightforward. First, the exchange of information depends on the industrial protocol these devices use. Second, configuration and installation are specific to the hardware in use, such as automation controllers or industrial gateways. This means that development for a specific industrial protocol depends highly on the hardware and software that vendors provide. In this work we propose a rapid-prototyping framework based on Arduino to solve this problem. For this project we have focused on the IO-Link protocol. The framework consists of an Arduino shield that acts as the physical layer and software that implements the IO-Link Master protocol. The main advantage of such a framework is that an application with industrial devices can be rapid-prototyped with ease, as it is vendor-independent, open-source and can easily be ported to other Arduino-compatible boards. In comparison, a typical approach requires proprietary hardware, is not easy to port to another system and is closed-source.
Background
Post-COVID-19 syndrome (PCS) is a lingering disease with ongoing symptoms such as fatigue and cognitive impairment resulting in a high impact on the daily life of patients. Understanding the pathophysiology of PCS is a public health priority, as it still poses a diagnostic and treatment challenge for physicians.
Methods
In this prospective observational cohort study, we analyzed the retinal microcirculation using Retinal Vessel Analysis (RVA) in a cohort of patients with PCS and compared it to an age- and gender-matched healthy cohort (n = 41, matched out of n = 204).
Measurements and main results
PCS patients exhibit persistent endothelial dysfunction (ED), as indicated by significantly lower venular flicker-induced dilation (vFID; 3.42% ± 1.77% vs. 4.64% ± 2.59%; p = 0.02), narrower central retinal artery equivalent (CRAE; 178.1 [167.5–190.2] vs. 189.1 [179.4–197.2], p = 0.01) and lower arteriolar-venular ratio (AVR; 0.84 [0.8–0.9] vs. 0.88 [0.8–0.9], p = 0.007). When combining AVR and vFID, predicted scores reached good ability to discriminate between groups (area under the curve: 0.75). Higher PCS severity scores correlated with lower AVR (R = −0.37, p = 0.017). The association of microvascular changes with PCS severity was amplified in PCS patients exhibiting higher levels of inflammatory parameters.
Conclusion
Our results demonstrate that prolonged endothelial dysfunction is a hallmark of PCS, and impairments of the microcirculation seem to explain ongoing symptoms in patients. As potential therapies for PCS emerge, RVA parameters may become relevant as clinical biomarkers for diagnosis and therapy management.
Preprint: Studies on the enzymatic reduction of levulinic acid using Chiralidon-R and Chiralidon-S
(2023)
The enzymatic reduction of levulinic acid by the chiral catalysts Chiralidon-R and Chiralidon-S, which are commercially available superabsorbed alcohol dehydrogenases, is described. Chiralidon®-R/S reduces levulinic acid to (R,S)-4-hydroxyvaleric acid and the (R)- or (S)-gamma-valerolactone.
In their article, the authors present the legislative projects of the European Union that will enter into force in the future or have already done so. At the outset, they point to the expired deadline for adapting standard contractual clauses; this adaptation may possibly be brought about by the Commission's Data Privacy Act, since it suggests adequacy. In addition to the Digital Markets Act, which obliges gatekeeper platforms to maintain non-discrimination when advertising third-party goods, the Digital Services Act and the Data Governance Act have likewise entered into force and will take effect in the future. The latter is intended to enable the exchange of non-personal data from public-sector data sets, although, unlike the DSA, which seeks to enforce consumer rights, its practical implementation is expected to fail to materialize for lack of any obligation. The Artificial Intelligence Act, the Data Act and the Cyber Resilience Act are still at the draft stage. All three are considered to be of particular practical relevance because of their broad scope of application, the threat of fines, or the cyber-threat situation. Through these legislative projects, the Commission extends its regulatory ambition to non-personal data and data transfers. As a result, companies will be confronted with more obligations, for whose implementation a functioning compliance management system is indispensable.
Data protection & data law – an outlook on 2023: national developments, CJEU referrals & supervision
(2023)
The authors provide an overview of national legislative procedures and key referral questions to the CJEU concerning data protection and data law for the year 2023. They begin with notes on, among other things, whistleblower protection, the adoption of the Consent Management Ordinance (Einwilligungsverwaltungs-Verordnung) specifying § 26 TTDSG, and the Mobility Data Act (Mobilitätsdatengesetz). They then discuss referral questions submitted by German courts and already answered by the CJEU on 12 January 2023, such as C-154/21 and C-132/21, as well as the CJEU decision of 9 February 2023 (C-453/21). The authors also list key CJEU decisions in the field of data law and data protection expected in 2023. Activities of the data protection supervisory authorities at the national and European level are mentioned as well. Finally, the authors draw attention to particularly interesting decisions expected in 2023, such as the CJEU judgment on the right of access, and to the relationship between the Whistleblower Protection Act implementing the Whistleblowing Directive on the one hand and data protection requirements on the other. They recommend keeping an eye on the future case law of the CJEU.
In this article, the author examines the CJEU judgment of 4 May 2023 (case C-60/22, DSB 2023, 178) on the effects of a controller's formal breach of the obligations under Articles 26 and 30 GDPR (juris: EUV 2016/679) on the lawfulness of data processing. After outlining the underlying facts and the background of the referral proceedings, the author gives an overview of the CJEU's key reasoning. In particular, the CJEU holds that the lawfulness of processing is governed by Article 6 GDPR, so that unlawful processing can only result from a breach of Articles 6 et seq. GDPR; the obligations under Articles 26 and 30 GDPR are not among the grounds for the lawfulness of processing. With regard to practice, the author concludes that the decision is not surprising in this respect; however, the finding that breaches of Articles 26 and 30 GDPR do not establish a violation of the fundamental right to the protection of personal data is surprising and troubling. It is also surprising that the CJEU notes, rather in passing, that the controller bears the burden of proof in proceedings vis-à-vis data subjects owing to its accountability obligation; whether the chamber was aware of the possible implications of these remarks remains questionable.
Turnover-based fines, otherwise known only from antitrust law, were one of the reasons why the General Data Protection Regulation (GDPR) attracted considerable attention before it entered into force. The often more relevant claims for damages, which, as in "Dieselgate", can entail far higher losses for companies owing to the large number of data subjects and the scalability perceived by legal service providers, initially went unnoticed. Meanwhile, the claim for damages under Article 82 GDPR has become the provision that has prompted the most referrals to the Court of Justice of the European Union (CJEU) in recent years. On 4 May 2023, the CJEU (judgment of 4 May 2023, case C-300/21, NWB GAAAJ-41389) ruled in a landmark judgment on central questions concerning compensation for non-material damage resulting from data protection violations.
In recent years, the development of large pretrained language models, such as BERT and GPT, has significantly improved information extraction systems on various tasks, including relation classification. State-of-the-art systems are highly accurate on scientific benchmarks. A lack of explainability is currently a complicating factor in many real-world applications. Comprehensible systems are necessary to prevent biased, counterintuitive, or harmful decisions.
We introduce semantic extents, a concept to analyze decision patterns for the relation classification task. Semantic extents are the most influential parts of texts concerning classification decisions. Our definition allows similar procedures to determine semantic extents for humans and models. We provide an annotation tool and a software framework to determine semantic extents for humans and models conveniently and reproducibly. Comparing both reveals that models tend to learn shortcut patterns from data. These patterns are hard to detect with current interpretability methods, such as input reductions. Our approach can help detect and eliminate spurious decision patterns during model development. Semantic extents can increase the reliability and security of natural language processing systems. Semantic extents are an essential step in enabling applications in critical areas like healthcare or finance. Moreover, our work opens new research directions for developing methods to explain deep learning models.
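Input reduction, mentioned above as a baseline interpretability method, greedily removes tokens as long as the model's decision is unchanged; whatever survives approximates the decisive span. A toy sketch under stated assumptions: the keyword-based `toy_predict` below stands in for a trained relation classifier and is purely illustrative:

```python
def input_reduction(tokens, predict, target):
    """Greedily drop tokens while the model's prediction stays `target`.

    `predict` maps a token list to a label; the surviving tokens
    approximate the most influential part of the input.
    """
    reduced = list(tokens)
    changed = True
    while changed:
        changed = False
        for i in range(len(reduced)):
            candidate = reduced[:i] + reduced[i + 1:]
            if candidate and predict(candidate) == target:
                reduced = candidate
                changed = True
                break
    return reduced

# Toy model: predicts "founded_by" iff the cue word "founded" is present.
toy_predict = lambda toks: "founded_by" if "founded" in toks else "no_relation"
tokens = "Apple was founded by Steve Jobs".split()
print(input_reduction(tokens, toy_predict, "founded_by"))  # ['founded']
```

A single surviving cue word like this is exactly the kind of shortcut pattern the paper argues such reductions make hard to interpret for humans.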
Extracting workflow nets from textual descriptions can be used to simplify guidelines or to formalize textual descriptions of formal processes such as business processes and algorithms. Manually extracting processes, however, requires domain expertise and effort. While automatic process model extraction is desirable, annotating texts with formalized process models is expensive. Therefore, there are only a few machine-learning-based extraction approaches. Rule-based approaches, in turn, require domain specificity to work well and can rarely distinguish relevant from irrelevant information in textual descriptions. In this paper, we present GUIDO, a hybrid approach to the process model extraction task that first classifies sentences regarding their relevance to the process model, using a BERT-based sentence classifier, and second extracts a process model from the sentences classified as relevant, using dependency parsing. The presented approach achieves significantly better results than a pure rule-based approach. GUIDO achieves an average behavioral similarity score of 0.93. Still, in comparison to purely machine-learning-based approaches, the annotation costs stay low.
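The two-stage flow (relevance classification, then extraction from only the relevant sentences) can be illustrated with stand-ins. The paper uses a BERT-based classifier and dependency parsing; the regular expressions below are hypothetical substitutes that merely show how the stages compose:

```python
import re

# Stage 1 stand-in: a sentence is "relevant" if it contains a sequencing cue.
RELEVANT = re.compile(r"\b(then|first|after|next|finally)\b", re.I)
# Stage 2 stand-in: extract a (verb, object) step from a relevant sentence.
ACTION = re.compile(r"\b(check|send|review|approve|archive)\b\s+the\s+(\w+)", re.I)

def extract_process(text):
    steps = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if not RELEVANT.search(sentence):   # stage 1: relevance filter
            continue
        m = ACTION.search(sentence)         # stage 2: step extraction
        if m:
            steps.append((m.group(1).lower(), m.group(2).lower()))
    return steps

text = ("Our company was founded in 1990. First, check the invoice. "
        "Then send the report. The office is in Aachen.")
print(extract_process(text))  # [('check', 'invoice'), ('send', 'report')]
```

The ordered step list is the raw material from which a workflow net would then be assembled.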
Providing healthcare services frequently involves cognitively demanding tasks, including diagnoses and analyses as well as complex decisions about treatments and therapy. From a global perspective, ethically significant inequalities exist between regions where the expert knowledge required for these tasks is scarce or abundant. One possible strategy to diminish such inequalities and increase healthcare opportunities in expert-scarce settings is to provide healthcare solutions involving digital technologies that do not necessarily require the presence of a human expert, e.g., in the form of artificial intelligent decision-support systems (AI-DSS). Such algorithmic decision-making, however, is mostly developed in resource- and expert-abundant settings to support healthcare experts in their work. As a practical consequence, the normative standards and requirements for such algorithmic decision-making in healthcare require the technology to be at least as explainable as the decisions made by the experts themselves. The goal of providing healthcare in settings where resources and expertise are scarce might come with a normative pull to lower the normative standards of using digital technologies in order to provide at least some healthcare in the first place. We scrutinize this tendency to lower standards in particular settings from a normative perspective, distinguish between different types of absolute and relative, local and global standards of explainability, and conclude by defending an ambitious and practicable standard of local relative explainability.
The work in modern open-pit and underground mines requires transporting large amounts of resources between fixed points. Navigation to these fixed points is a repetitive task that can be automated. The challenge in automating the navigation of vehicles commonly used in mines lies in the systemic properties of such vehicles. Many mining vehicles, such as the one used in the research for this paper, steer via an articulated joint that bends the vehicle's drive axis to change its course, and use a hydraulic drive system to actuate axial drive components or the movements of tippers, if available. To address the difficulties of controlling such a vehicle, we present a model-predictive control (MPC) approach. While control optimisation based on parallel error minimisation of the predicted state has been established in the past, we provide insight into the design and implementation of an MPC for an articulated mining vehicle and show the results of real-world experiments in an open-pit mine environment.
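A model-predictive controller of this kind rolls a vehicle model forward over a short horizon for each candidate control and picks the one minimising a predicted-error cost. The sketch below is a deliberately simplified stand-in: it uses a kinematic bicycle model and a coarse grid search, not the articulated-joint kinematics or the optimiser used in the paper:

```python
import math

def simulate(state, steer, v=1.0, dt=0.1, wheelbase=2.5):
    """One step of a simplified kinematic bicycle model (a stand-in for
    the articulated-joint kinematics of the actual mining vehicle)."""
    x, y, yaw = state
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += v / wheelbase * math.tan(steer) * dt
    return (x, y, yaw)

def mpc_step(state, horizon=10, candidates=21, max_steer=0.5):
    """Evaluate a grid of constant steering angles over the horizon and
    return the one minimising the predicted cross-track, heading and
    effort cost for the reference path y = 0."""
    best, best_cost = 0.0, float("inf")
    for k in range(candidates):
        steer = -max_steer + 2.0 * max_steer * k / (candidates - 1)
        s, cost = state, 0.0
        for _ in range(horizon):
            s = simulate(s, steer)
            cost += s[1] ** 2 + 0.1 * s[2] ** 2 + 0.01 * steer ** 2
        if cost < best_cost:
            best, best_cost = steer, cost
    return best

# One metre left of the reference path, the controller steers back toward it:
print(mpc_step((0.0, 1.0, 0.0)) < 0.0)  # True
```

In a receding-horizon loop only the first control of the chosen candidate is applied, then the optimisation is repeated from the newly measured state.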
Supervised machine learning and deep learning require large amounts of labeled data, which data scientists obtain in a manual, time-consuming annotation process. To mitigate this challenge, Active Learning (AL) proposes promising data points for annotators to label next, instead of a sequential or random sample. This method is supposed to save annotation effort while maintaining model performance.
However, practitioners face many AL strategies for different tasks and need an empirical basis to choose between them. Surveys categorize AL strategies into taxonomies without performance indications. Presentations of novel AL strategies compare their performance only to a small subset of existing strategies. Our contribution addresses this empirical basis by introducing a reproducible active learning evaluation (ALE) framework for the comparative evaluation of AL strategies in NLP.
The framework allows the implementation of AL strategies with low effort and a fair data-driven comparison through defining and tracking experiment parameters (e.g., initial dataset size, number of data points per query step, and the budget). ALE helps practitioners to make more informed decisions, and researchers can focus on developing new, effective AL strategies and deriving best practices for specific use cases. With best practices, practitioners can lower their annotation costs. We present a case study to illustrate how to use the framework.
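The query, annotate, retrain cycle that such a framework runs and tracks can be sketched in a few lines. Everything below is a toy stand-in under stated assumptions: a threshold "model", an exact oracle and plain uncertainty sampling, used only to make the tracked experiment parameters (initial set, batch size, budget) concrete:

```python
import math
import random

def uncertainty_sampling(pool, predict_proba, batch_size):
    """Pick the unlabeled points the model is least certain about
    (predicted probability closest to 0.5 for a binary task)."""
    return sorted(pool, key=lambda x: abs(predict_proba(x) - 0.5))[:batch_size]

def active_learning_loop(pool, oracle, train, query_steps, batch_size):
    """Skeleton of the loop an ALE-style framework would track:
    query -> annotate -> retrain, with a fixed budget of query steps."""
    labeled = []
    model = train(labeled)
    for _ in range(query_steps):
        for x in uncertainty_sampling(pool, model, batch_size):
            pool.remove(x)
            labeled.append((x, oracle(x)))
        model = train(labeled)
    return model, labeled

def train(labeled):
    """Toy 'model': logistic curve around the mean of the labeled points."""
    if not labeled:
        return lambda x: 0.5
    thr = sum(x for x, _ in labeled) / len(labeled)
    return lambda x: 1 / (1 + math.exp(-(x - thr)))

random.seed(0)
pool = [random.uniform(-1, 1) for _ in range(100)]
model, labeled = active_learning_loop(
    pool, oracle=lambda x: int(x > 0), train=train,
    query_steps=5, batch_size=4)
print(len(labeled), len(pool))  # 20 80
```

Swapping `uncertainty_sampling` for another strategy while keeping the loop and its tracked parameters fixed is exactly the kind of fair, data-driven comparison described above.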
Messenger apps like WhatsApp and Telegram are frequently used for everyday communication, but they can also serve as platforms for illegal activity. Telegram allows public groups with up to 200,000 participants. Criminals use these public groups to trade illegal commodities and services, which is a concern for law enforcement agencies, who manually monitor suspicious activity in these chat rooms. This research demonstrates how natural language processing (NLP) can assist in analyzing these chat rooms, providing an explorative overview of the domain and facilitating purposeful analyses of user behavior. We provide a publicly available corpus of text messages from four self-proclaimed black market chat rooms, annotated with entities and relations. Our pipeline approach aggregates the product attributes extracted from user messages into profiles and uses these, together with the products sold, as features for clustering. The extracted structured information is the foundation for further data exploration, such as identifying top vendors or fine-granular price analyses. Our evaluation shows that pretrained word vectors perform better for unsupervised clustering than state-of-the-art transformer models, while the latter remain superior for sequence labeling.
Werkstatt Zukunft - FH Aachen sees itself as an interdisciplinary ideas factory
04| Setting off into the unknown
08| Out of the energy crisis with waste-paper scraps
14| More power for the energy transition
20| Pathfinders in the data jungle
23| A historic step
24| Forward together
30| For a fair university. For everyone.
32| The green rider
34| From flight simulator to robot
36| Our apprentices are top!
38| WE are a sports team!
40| Efficient and safe: the mining of the future
46| Clean drinking water from sunlight
48| A new fire alarm system for Aachen Cathedral
50| An award for exceptional commitment
52| New ideas for Schloss Lichtenburg
55| Brain teaser
56| One-man show
60| A fighter for beauty
61| A charming diplomat
62| Team play for the future
63| Imprint