Humic substances (HS), as important environmental components, are essential to soil health and agricultural sustainability. The use of low-rank coal (LRC) for energy generation has declined considerably with the growing adoption of renewable energy sources and natural gas. However, its potential as a soil amendment for maintaining soil quality and productivity deserves more recognition. LRC, a highly heterogeneous natural material, contains large quantities of HS and may effectively help restore the physicochemical, biological, and ecological functionality of soil. A growing number of studies support the view that LRC and its derivatives can positively affect the soil microclimate, nutrient status, and organic matter turnover. Moreover, the phytotoxic effects of some pollutants can be reduced by subsequent LRC application. The broad geographic availability, relatively low cost, and good technical applicability of LRC make it easy to meet soil amendment and conditioner requirements worldwide. This review analyzes and emphasizes the potential of LRC and its numerous forms and combinations for soil amelioration and crop production. A systematic investment strategy covering the safe utilization and long-term application of LRC for sustainable agricultural production would be of great benefit.
The coupling of ligand-stabilized gold nanoparticles with field-effect devices offers new possibilities for label-free biosensing. In this work, we study the immobilization of aminooctanethiol-stabilized gold nanoparticles (AuAOTs) on the silicon dioxide surface of a capacitive field-effect sensor. The terminal amino group of the AuAOTs is well suited for functionalization with biomolecules. The attachment of the positively charged AuAOTs to the capacitive field-effect sensor was detected by direct electrical readout using capacitance-voltage and constant-capacitance measurements. The higher the particle density on the sensor surface, the more pronounced the measured signal change. The results demonstrate the suitability of capacitive field-effect sensors for the non-destructive, quantitative validation of nanoparticle immobilization. In addition, the electrostatic binding of the polyanion polystyrene sulfonate to the AuAOT-modified sensor surface was studied as a model system for the label-free detection of charged macromolecules. This approach can most likely be transferred to the label-free detection of other charged molecules such as enzymes or antibodies.
Erdbebennachweis von Mauerwerksbauten mit realistischen Modellen und erhöhten Verhaltensbeiwerten
(2021)
Applying the linear verification concept to masonry buildings already means that structural safety verifications for buildings with common floor plans in regions of moderate seismicity can no longer be carried out. In Germany, this problem will intensify with the introduction of continuous probabilistic seismic hazard maps. Because of the increase in seismic actions that results in many locations, it is necessary to make the load-bearing reserves that have so far gone unaccounted for available to building practice through comprehensible verification concepts. This paper presents a concept for the building-specific determination of increased behavior factors. The behavior factors are composed of three components that account for load redistribution in the floor plan, deformation capacity and energy dissipation, and overstrength. For the computational determination of these three components, a nonlinear verification concept based on pushover analyses is proposed, in which the interaction of walls and floor slabs is described by a degree of fixity. To determine the degrees of fixity, a nonlinear modeling approach is introduced that captures the wall-slab interaction. The application of the concept with increased building-specific behavior factors is demonstrated on a multi-family house built of calcium silicate masonry units. For this building, the results of the linear verifications with increased behavior factors are considerably closer to the results of nonlinear verifications, so common floor plans in seismic regions remain verifiable with the traditional linear calculation approaches.
With the introduction of the national annex DIN EN 1998-1/NA, masonry buildings in Germany must be verified on the basis of a new probabilistic seismic hazard map. For successful seismic verification of masonry buildings with common floor plans, the forthcoming application document provides new computational verification options with which the load-bearing reserves of masonry buildings can be exploited more effectively in building practice with manageable effort. The standard calculation method remains the force-based verification, which can now be carried out with higher behavior factors than under DIN 4149. The higher behavior factors are based on better utilization of the building-specific deformation capacity and energy dissipation, as well as on the redistribution of shear forces in the floor plan by accounting for frame action through wall-slab interaction. Alternatively, a nonlinear verification based on pushover analyses can be applied. The provisions for masonry buildings are completed by new rules for non-load-bearing interior walls and exterior masonry veneers. This paper presents the fundamentals and background of the new computational verifications in DIN EN 1998-1/NA and demonstrates their application with a practical example.
Past earthquakes have demonstrated the high vulnerability of industrial facilities equipped with complex process technologies, leading to serious damage to process equipment and the multiple, simultaneous release of hazardous substances. Nonetheless, current standards for the seismic design of industrial facilities are considered inadequate to guarantee proper safety against exceptional events entailing loss of containment and its consequences. On these premises, the SPIF project (Seismic Performance of Multi-Component Systems in Special Risk Industrial Facilities) was proposed within the framework of the European H2020 SERA funding scheme. In detail, the objective of the SPIF project is to investigate the seismic behaviour of a representative industrial multi-storey frame structure equipped with complex process components by means of shaking table tests. Along this main line, and from a performance-based design perspective, the issues investigated in depth are the interaction between a primary moment-resisting frame (MRF) steel structure and secondary process components that influence the performance of the whole system, and a proper check of floor spectra predictions. The evaluation of the experimental data clearly shows a favourable performance of the MRF structure, some weaknesses in local details due to the interaction between floor crossbeams and process components and, finally, the overconservatism of current design standards with respect to floor spectra predictions.
In the context of the Solvency II directive, operating an internal risk model is one possible way to assess risk and determine the solvency capital requirement of an insurance company in the European Union. A Monte Carlo procedure is customary for generating the model output, and to comply with the directive, validation of the internal risk model is conducted on the basis of this output. For this purpose, we suggest a new test for checking whether there is a significant change in the modeled solvency capital requirement. Asymptotic properties of the test statistic are investigated and a bootstrap approximation is justified. A simulation study investigates the performance of the test in the finite-sample case and confirms the theoretical results. The internal risk model and the application of the test are illustrated in a simplified example. The method has more general use for inference on a broad class of law-invariant and coherent risk measures on the basis of a paired sample.
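The paired-sample bootstrap idea can be sketched in a few lines. The following is a minimal illustration, not the authors' actual test statistic: it estimates the solvency capital requirement as an empirical 99.5% Value-at-Risk and bootstraps the difference between two model runs; all distributions and parameters are invented for the example.

```python
import numpy as np

def scr(losses, alpha=0.995):
    """Empirical solvency capital requirement: the 99.5% quantile
    (Value-at-Risk) of a simulated loss sample."""
    return np.quantile(losses, alpha)

def bootstrap_scr_test(x, y, n_boot=200, alpha=0.995, seed=0):
    """Bootstrap check for a change in the modeled SCR based on a
    paired Monte Carlo sample (x, y) from two model runs.
    Returns the observed SCR difference and a bootstrap p-value."""
    rng = np.random.default_rng(seed)
    n = len(x)
    diff_obs = scr(y, alpha) - scr(x, alpha)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)       # resample index pairs jointly
        diffs[b] = scr(y[idx], alpha) - scr(x[idx], alpha)
    # fraction of resampled differences at least as extreme as observed,
    # after centering the bootstrap distribution at the observed value
    p_value = np.mean(np.abs(diffs - diff_obs) >= np.abs(diff_obs))
    return diff_obs, p_value

rng = np.random.default_rng(1)
x = rng.lognormal(0.0, 1.0, 10_000)       # losses from the baseline model
y = x.copy()                               # unchanged model: SCR identical
diff, p = bootstrap_scr_test(x, y)
```

Resampling the index pairs jointly preserves the dependence between the two model runs, which is the point of using a paired sample.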
Here we present example STACK exercises that are free of the problems arising from diverse communication channels and (web-based) computer algebra systems (CAS). They are therefore particularly suitable for open-book online examinations, since a fair examination setting can be guaranteed.
Einfluss von Künstlicher Intelligenz auf Customer Journeys am Beispiel von intelligentem Parken
(2021)
New applications of artificial intelligence (AI) are increasingly emerging in the consumer market. More and more devices and services that communicate autonomously over the Internet are also entering the market, allowing these devices and services to be enhanced with novel AI-based services. Such services can influence the way customers make commercial decisions and thus fundamentally change the customer experience. The influence of AI on commercial interactions has not yet been investigated comprehensively. Based on a framework that gives a first overview of the effects of AI on commercial interactions, this chapter analyzes the influence of AI on customer journeys using the concrete use case of intelligent parking. The resulting insights can be used in practice as a basis for understanding the potential of AI and applying it to the design of one's own customer journeys.
Intelligent autonomous software robots that replace human activities and perform administrative processes are a reality in today's corporate world. This includes, for example, decisions about invoice payments, the identification of customers for a marketing campaign, and answering customer complaints. What happens if such a software robot causes damage? Given the complete absence of human activity, the question is not trivial. It could even happen that no one is liable for damage towards a third party, which could create an incalculable legal risk for business partners. Furthermore, the implementation and operation of such software robots involves various stakeholders, which makes identifying the originator of a damage an almost unsolvable endeavor. Overall, all parties involved are well advised to consider the legal situation carefully. This chapter discusses the liability of software robots from an interdisciplinary perspective. Based on different technical scenarios, the legal aspects of liability are discussed.
The benefits of robotic process automation (RPA) are closely tied to the use of commercial off-the-shelf (COTS) software products that can easily be implemented and customized by business units. But how can the best-fitting RPA product for a specific situation, one that creates the expected benefits, be found? This question belongs to the general area of software evaluation and selection. With more than 75 RPA products currently on the market, guidance that considers RPA's specifics is required. This chapter therefore proposes a criteria-based selection method specifically for RPA. The method includes a quantitative evaluation of costs and benefits as well as a qualitative utility analysis based on functional criteria. By using the visualization of financial implications (VOFI) method, an application-oriented structure is provided that opposes the total cost of ownership (TCO) to the time savings times salary (TSTS). For the utility analysis, a detailed list of functional criteria for RPA is offered. The whole method is based on a multi-vocal review of scientific and non-scholarly literature, including publications by business practitioners, consultants, and vendors. The application of the method is illustrated with a concrete RPA example. The illustrated structures, templates, and criteria can be directly utilized by practitioners in their real-life RPA implementations. In addition, a normative decision process for selecting among RPA alternatives is proposed before the chapter closes with a discussion and outlook.
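The quantitative part of such an evaluation can be illustrated with a small calculation contrasting TCO with TSTS. All figures below are hypothetical and serve only to show the structure of the comparison, not actual RPA product costs:

```python
# Illustrative TCO vs. TSTS comparison for one RPA candidate, in the
# spirit of a VOFI-style quantitative evaluation. All figures invented.

license_per_year = 12_000        # EUR, software licenses
implementation = 25_000          # EUR, one-off setup and development
maintenance_per_year = 5_000     # EUR, operations and change requests
years = 3                        # planning horizon

hours_saved_per_year = 1_800     # manual effort removed by the robot
hourly_salary = 35               # EUR, fully loaded labor cost

# Total cost of ownership over the planning horizon
tco = implementation + years * (license_per_year + maintenance_per_year)

# Time savings times salary (TSTS) over the same horizon
tsts = years * hours_saved_per_year * hourly_salary

net_benefit = tsts - tco         # positive value favors the candidate
```

A full VOFI would additionally lay these payments out period by period with financing and interest effects; the simple totals above only capture the basic opposition of cost and benefit.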
Robotic process automation (RPA) has attracted increasing attention in research and practice. This chapter positions, structures, and frames the topic as an introduction to this book. RPA is understood as a broad concept that comprises a variety of concrete solutions. From a management perspective RPA offers an innovative approach for realizing automation potentials, whereas from a technical perspective the implementation based on software products and the impact of artificial intelligence (AI) and machine learning (ML) are relevant. RPA is industry-independent and can be used, for example, in finance, telecommunications, and the public sector. With respect to RPA this chapter discusses definitions, related approaches, a structuring framework, a research framework, and an inside as well as outside architectural view. Furthermore, it provides an overview of the book combined with short summaries of each chapter.
Subject of this case is Deutsche Telekom Services Europe (DTSE), a service center for administrative processes. Due to the high volume of repetitive tasks (e.g., 100k manual uploads of offer documents into SAP per year), automation was identified as an important strategic target with a high management attention and commitment. DTSE has to work with various backend application systems without any possibility to change those systems. Furthermore, the complexity of administrative processes differed. When it comes to the transfer of unstructured data (e.g., offer documents) to structured data (e.g., MS Excel files), further cognitive technologies were needed.
Dimensionen 1-2021: Magazin der FH Aachen University of Applied Sciences - 50 Jahre FH Aachen
(2021)
04| Adieda & Welkomme
06| Das WIR wird großgeschrieben
08| Hoch aus dem Norden, da komm ich her!
12| „Ich möchte ein Heimatgefühl erzeugen“
14| Das neue Rektorat – persönlich und privat
18| „Wir müssen uns einen Kompass geben“
20| Keime im Wasser
22| „Ist mitgemeint auch wirklich mitgedacht?“
24| Wachs für den Weltraum
28| Auslandssemester trotz Pandemie
30| Virtuelles Reinschnuppern
31| Top-Platzierungen für die FH
32| Luftstrom
36| Gründen will gelehrt sein
38| „Lebende Plastikkugel“
40| Pioniere des 21. Jahrhunderts
43| Wir bleiben in Kontakt
44| Auszeit
46| Hand in Hand ins All
48| Kampf gegen tödliche Infektionen
50| Ein Bau für den Holzbau
52| Wissen ist Silber. Machen ist Gold.
60| Honorarprofessur für Dr. Roger Uhle
61| Faktoren ohne Null
The molecular weight properties of lignins are among the key characteristics that need to be analyzed for a successful industrial application of these promising biopolymers. In this study, the use of 1H NMR as well as diffusion-ordered spectroscopy (DOSY NMR), combined with multivariate regression methods, was investigated for determining the molecular weight (Mw and Mn) and the polydispersity of organosolv lignins (n = 53; Miscanthus x giganteus, Paulownia tomentosa, and Silphium perfoliatum). The suitability of the models was demonstrated by cross-validation (CV) as well as by an independent validation set of samples of different biomass origins (beech wood and wheat straw). CV errors of ca. 7–9% and 14–16% were achieved for all parameters with the models from the 1H NMR spectra and the DOSY NMR data, respectively. The prediction errors for the validation samples were in a similar range for the partial least squares model from the 1H NMR data and for a multiple linear regression using the DOSY NMR data. The results indicate the usefulness of NMR measurements combined with multivariate regression as a potential alternative to more time-consuming methods such as gel permeation chromatography.
In this study, a recently proposed NMR standardization approach based on the 2H integral of the deuterated solvent is presented for the quantitative multicomponent analysis of complex mixtures. As a proof of principle, the existing NMR routine for the analysis of Aloe vera products was modified. Instead of using absolute integrals of the targeted compounds and an internal standard (nicotinamide) from 1H-NMR spectra, quantification was performed based on the ratio of a particular 1H-NMR compound integral to the 2H-NMR signal of the deuterated solvent D2O. Validation characteristics (linearity, repeatability, accuracy) were evaluated, and the results showed that the method is as precise as internal standardization in the case of multicomponent screening. Moreover, dehydration by freeze drying is no longer necessary in the new routine, so our NMR profiling of A. vera products now needs only limited sample preparation and data processing. The new standardization methodology provides an appealing alternative for multicomponent NMR screening. In general, this novel approach, using standardization by the 2H integral, benefits from fewer sample preparation steps and reduced uncertainties, and can be recommended in various application areas (purity determination, forensics, pharmaceutical analysis, etc.).
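The core of this standardization reduces to a ratio calculation. The sketch below is a simplified illustration with invented numbers; it ignores the differing receptivities of 1H and 2H, which in practice are absorbed into a calibration of the effective deuteron concentration:

```python
# Hypothetical sketch of qNMR quantification via the 2H signal of the
# deuterated solvent (D2O) instead of an internal standard.
# All numerical values below are illustrative, not measured data.

def conc_by_2h_reference(int_1h, n_h, int_2h, n_d, c_solvent_d):
    """Analyte concentration from the ratio of a 1H compound integral
    (int_1h, over n_h protons) to the solvent 2H integral (int_2h,
    over n_d deuterons), given an effective deuteron concentration
    c_solvent_d in mol/L. Receptivity differences between 1H and 2H
    are assumed to be folded into c_solvent_d by calibration."""
    return (int_1h / n_h) / (int_2h / n_d) * c_solvent_d

# Example: a CH3 singlet (3 H) integrating to 1.2 relative units against
# a D2O 2H signal (2 D per molecule) of 100 units, with an assumed
# effective deuteron concentration of 110 mol/L.
c = conc_by_2h_reference(int_1h=1.2, n_h=3, int_2h=100.0, n_d=2,
                         c_solvent_d=110.0)
```

The per-nucleus normalization (dividing each integral by its number of contributing nuclei) is what makes integrals of chemically different signals comparable.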
The possibility of determining various characteristics of powdered heparin (n = 115) was investigated with infrared spectroscopy. The evaluation of the heparin samples included several parameters, such as purity grade, distributing company, animal source, and heparin species (i.e., Na-heparin, Ca-heparin, and heparinoids). Multivariate analysis using principal component analysis (PCA), soft independent modelling of class analogy (SIMCA), and partial least squares discriminant analysis (PLS-DA) was applied to model the spectral data. Different pre-processing methods were applied to the IR spectral data; multiplicative scatter correction (MSC) was chosen as the most suitable. The obtained results were confirmed by nuclear magnetic resonance (NMR) spectroscopy. The good predictive ability of this approach demonstrates the potential of IR spectroscopy and chemometrics for screening heparin quality. This approach, however, is designed as a screening tool and is not intended to replace either of the methods required by the USP and FDA.
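Among the pre-processing options mentioned, multiplicative scatter correction is simple enough to sketch. The following minimal implementation (a common textbook form, not necessarily the exact routine used in the study) regresses each spectrum against the mean spectrum and removes the fitted offset and slope:

```python
import numpy as np

def msc(spectra, reference=None):
    """Multiplicative scatter correction (MSC): fit each spectrum to a
    reference (default: the mean spectrum) by ordinary least squares
    and remove the fitted additive offset and multiplicative slope."""
    spectra = np.asarray(spectra, dtype=float)
    ref = spectra.mean(axis=0) if reference is None else reference
    corrected = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        slope, offset = np.polyfit(ref, s, 1)   # s ≈ slope * ref + offset
        corrected[i] = (s - offset) / slope
    return corrected

# Two copies of the same underlying "spectrum" with different scatter
# (offset and scale) collapse onto a common shape after MSC.
base = np.sin(np.linspace(0, 3, 200)) + 1.5
spectra = np.vstack([1.2 * base + 0.3, 0.8 * base - 0.1])
corrected = msc(spectra)
```

Because scatter in diffuse-reflectance IR shows up largely as a per-sample baseline offset and gain, removing the fitted offset and slope leaves the chemically informative band shapes for PCA, SIMCA, or PLS-DA.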
Quantitative nuclear magnetic resonance (qNMR) is routinely performed with internal or external standardization. This manuscript describes a simple alternative to these common workflows that uses the NMR signal of another active nucleus of the calibration compound. For example, the NMR quantification of any arbitrary compound can be based on indirect concentration referencing that relies on a solvent having both 1H and 2H signals. To perform high-quality quantification, the deuteration level of the deuterated solvent used has to be estimated.
In this contribution, the new method was applied to the determination of deuteration levels in different deuterated solvents (MeOD, ACN, CDCl3, acetone, benzene, DMSO-d6). Isopropanol-d6, which contains a defined number of deuterons and protons, was used for standardization. Validation characteristics (precision, accuracy, robustness) were calculated, and the results showed that the method can be used in routine practice. The uncertainty budget was also evaluated. In general, this novel approach, using standardization by the 2H integral, benefits from fewer sample preparation steps and reduced uncertainties, and can be applied in various application areas (purity determination, forensics, pharmaceutical analysis, etc.).
The initial idea of Robotic Process Automation (RPA) is the automation of business processes through simple emulation of user input and output by software robots. Hence, it can be assumed that no changes to the software systems used or to the existing Enterprise Architecture (EA) are required. In this short practical paper we discuss this assumption based on a real-life implementation project. We show that a successful RPA implementation might require architectural work during analysis, implementation, and migration. As a practical paper, we focus on exemplary lessons learned and new questions related to RPA and EA.
Digital Shadows as the aggregation, linkage and abstraction of data relating to physical objects are a central vision for the future of production. However, the majority of current research takes a technocentric approach, in which the human actors in production play a minor role. Here, the authors present an alternative anthropocentric perspective that highlights the potential and main challenges of extending the concept of Digital Shadows to humans. Following future research methodology, three prospections that illustrate use cases for Human Digital Shadows across organizational and hierarchical levels are developed: human-robot collaboration for manual work, decision support and work organization, as well as human resource management. Potentials and challenges are identified using separate SWOT analyses for the three prospections and common themes are emphasized in a concluding discussion.