The qualitative and quantitative detection of target substances in an aqueous sample is of interest for many applications, for example the detection of contaminations in drinking water in crisis situations. Here it is important not only that pathogens can be detected with high sensitivity, but also that the analysis is fast, so that safe drinking water can be made available quickly to those affected in the event of a disaster. Since a functioning laboratory infrastructure cannot be assumed to be available nearby in such a scenario, the measurement must be possible directly on site. This work investigated whether such rapid analysis is possible using superparamagnetic beads (MBs) and the magnetic frequency mixing technique. The MBs are bound to the target substance via primary antibodies and fixed to the pore surface of a polyethylene filter via secondary antibodies (sandwich immunoassay). Quantification of the target substance can thus be reduced to a magnetic measurement of the immobilized MB markers. The magnetic frequency mixing technique is based on exciting the sample with magnetic fields of two different frequencies. The mixing frequencies generated by the nonlinear magnetization curve of the superparamagnetic MBs are typically analyzed by two-stage lock-in detection (analog demodulation), which was realized in a handheld magnetic reader. In addition to this technique, the principle of directly digitizing the entire response signal with subsequent Fourier analysis of the generated mixing frequencies was implemented experimentally, in order to record the amplitudes and phases of several mixing frequencies simultaneously.
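The effect the frequency mixing technique exploits can be sketched numerically: a nonlinear (Langevin-type) magnetization driven by two tones produces combination frequencies that a linear material would not show. All parameters below (field amplitudes, a 50 kHz / 60 Hz drive pair, the Langevin scaling) are illustrative assumptions, not values from the thesis.

```python
import numpy as np

# Two-tone excitation through a Langevin magnetization curve generates
# mixing components such as f1 + 2*f2 (illustrative parameters only).
fs = 1_000_000                       # sampling rate in Hz
t = np.arange(0, 0.1, 1.0 / fs)      # 0.1 s record -> 10 Hz frequency resolution
f1, f2 = 50_000, 60                  # high- and low-frequency drive fields

H = 0.5 * np.sin(2 * np.pi * f1 * t) + 2.0 * np.sin(2 * np.pi * f2 * t)

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x (bead magnetization curve)."""
    x = np.asarray(x, dtype=float)
    safe = np.where(np.abs(x) < 1e-6, 1.0, x)
    return np.where(np.abs(x) < 1e-6, x / 3.0, 1.0 / np.tanh(safe) - 1.0 / safe)

M = langevin(3.0 * H)                # nonlinear response of the beads

spectrum = np.abs(np.fft.rfft(M)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)

def amplitude_at(f):
    return spectrum[np.argmin(np.abs(freqs - f))]

mixing_component = amplitude_at(f1 + 2 * f2)   # exists only for nonlinear M(H)
```

In a real measurement this component is isolated either by the two-stage lock-in detection or, as in the digital demodulation approach, by exactly this kind of Fourier analysis of the digitized response.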
One way to increase sensitivity is magnetic preconcentration, in which the MBs are separated from a larger sample volume by means of a magnetic field gradient before the magnetic analysis. To characterize various commercial MBs with respect to their magnetic separability, a setup for measuring their magnetophoretic mobilities was realized and their velocities in the gradient field were measured microscopically. Since a sample often needs to be examined not only for a single target substance but simultaneously for several different pathogens, various approaches enabling such a multiparametric magnetic immunoassay were developed and tested. On the one hand, a spatial separation of the binding regions for different target substances was realized, which can then be evaluated sequentially. On the other hand, the discrimination of different target substances based on the characteristics of the differently functionalized MB types bound to them was investigated. For such a discrimination, the excitation frequency of the magnetic frequency mixing technique was varied during a measurement. It was shown that different MB types can be distinguished by the phase of their frequency mixing signals. Furthermore, it was shown that the signal curve of a binary mixture of two different MB types results as a gradual transition between the curves of the two pure MB solutions. A further analysis method for a multiparametric immunoassay consists in applying an additional adjustable static magnetic offset field. For this purpose, several setups based on permanent magnets and electromagnets were simulated, constructed, and characterized. Simulations showed that a discrimination based on this method is possible for MBs with different magnetic particle moments.
As a direct application of the magnetic reader developed here in combination with digital demodulation, a magnetic assay against the B subunit of cholera toxin in drinking water was demonstrated with a low detection limit of 0.2 ng/ml.
A German–Brazilian research project investigates sugarcane as an energy crop for biogas production by anaerobic digestion. The aim of the project is a continuous, efficient, and stable biogas process with sugarcane as the substrate. Tests are carried out in a fermenter with a volume of 10 l.
In order to optimize the organic loading rate and achieve a stable process, a continuous process at laboratory scale has been devised. The daily feed-in quantity and the harvest time of the substrate sugarcane have been varied. Analyses of the digester content were conducted twice per week to monitor the process: the ratio of volatile organic acids to total inorganic carbonate buffer (VFA/TAC), the concentration of short-chain fatty acids, the organic dry matter, the pH value, and the total nitrogen, phosphate, and ammonium concentrations were monitored. In addition, the gas quality (the percentages of CO₂, CH₄, and H₂) and the quantity of the produced gas were analyzed.
The investigations demonstrated feasible and economical production of biogas in a continuous process with energy cane as substrate. With a daily feeding rate of 1.68 gᵥₛ/(l·d), the average specific gas formation rate was 0.5 m³/kgᵥₛ. The long-term study demonstrates a surprisingly fast metabolism of short-chain fatty acids. This indicates a stable and less susceptible process compared to other substrates.
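The reported figures can be combined in a quick back-of-the-envelope check, assuming the 10 l fermenter is fed at the stated loading rate and yields the stated specific gas formation rate (all values from the abstract; the conversion is plain unit arithmetic):

```python
# Daily biogas volume expected from the reported loading rate and yield.
reactor_volume_l = 10.0                 # fermenter working volume in litres
loading_rate_gvs_per_l_d = 1.68         # daily feeding rate in g_VS per litre and day
specific_yield_m3_per_kgvs = 0.5        # average specific gas formation rate

daily_vs_load_kgvs = loading_rate_gvs_per_l_d * reactor_volume_l / 1000.0  # kg_VS/d
daily_gas_l = daily_vs_load_kgvs * specific_yield_m3_per_kgvs * 1000.0     # litres/d
```

This corresponds to roughly 8.4 l of biogas per day at laboratory scale.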
Extracellular acidification is a basic indicator of alterations in two vital metabolic pathways: glycolysis and cellular respiration. Measuring these alterations by monitoring extracellular acidification with cell-based biosensors such as the light-addressable potentiometric sensor (LAPS) plays an important role in studying these pathways, whose disorders are associated with numerous diseases including cancer. However, the surface of the biosensor must be specially tailored to ensure high cell compatibility, so that cells exhibit more in vivo-like behavior; this is critical for obtaining realistic in vitro results from analyses such as drug discovery experiments. In this work, O2 plasma patterning of the LAPS surface is studied to enhance surface features of the sensor chip, e.g., wettability and biofunctionality. The surface treated with O2 plasma for 30 s exhibits enhanced cytocompatibility for adherent CHO-K1 cells, promoting cell spreading and proliferation. The plasma-modified LAPS chip is then integrated into a microfluidic system that provides two identical channels to facilitate differential measurements of the extracellular acidification of CHO-K1 cells. To the best of our knowledge, this is the first time that extracellular acidification within microfluidic channels has been quantitatively visualized as differential (bio-)chemical images.
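The differential idea behind the two identical channels can be sketched with synthetic data (not the authors' pipeline; the array sizes, drift value, and noise level are invented for illustration): subtracting a cell-free reference channel cancels common-mode effects such as drift, leaving the cell-induced acidification signal.

```python
import numpy as np

# Differential (bio-)chemical imaging sketch: measurement minus reference.
rng = np.random.default_rng(0)

drift = 0.3                                   # common-mode offset in both channels
cell_signal = np.zeros((32, 32))
cell_signal[8:24, 8:24] = 1.0                 # acidification where cells adhere

measurement = cell_signal + drift + rng.normal(0.0, 0.01, (32, 32))
reference = drift + rng.normal(0.0, 0.01, (32, 32))

differential_image = measurement - reference  # drift cancels, cell signal remains
```

The same subtraction applied pixel by pixel to LAPS photocurrent maps yields the differential chemical images described above.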
In collaborative research projects, researchers and practitioners work together to solve business-critical challenges. These projects often involve ETL processes in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to automate this task.
Since machine learning approaches are not deterministic, their output quality may decrease over time. This leads to an overall quality loss of the application that embeds the machine learning models. Hence, software quality in development and in production may differ.
Machine learning models are black boxes. That makes practitioners skeptical and raises the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production enables an early response to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that possible new inputs are integrated into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with automated software quality monitoring in productive use and the possibility to auto-generate new test data from documents processed in production.
Through automated monitoring of the software quality and auto-generated test data, this approach ensures that the software quality meets and keeps the requested thresholds in productive use, even during further continuous deployment and with changing input data.
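A minimal sketch of the monitoring loop (all names, labels, and the threshold are assumptions, not the paper's reference implementation): the deployed model is evaluated against test pairs auto-generated from production documents, and an alarm is raised as soon as the measured quality falls below the requested threshold.

```python
from dataclasses import dataclass

@dataclass
class QualityReport:
    accuracy: float
    threshold: float

    @property
    def passed(self) -> bool:
        return self.accuracy >= self.threshold

def monitor(predictions, labels, threshold=0.9) -> QualityReport:
    """Compare model predictions against auto-generated test labels."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return QualityReport(accuracy=correct / len(labels), threshold=threshold)

# Hypothetical document-classification pairs harvested from production:
report = monitor(["invoice", "contract", "invoice", "report"],
                 ["invoice", "contract", "report", "report"],
                 threshold=0.9)
print(report.passed)  # prints False: 3/4 correct is below the 0.9 threshold
```

In a continuous-deployment setting this check would run on every batch of processed documents, so that quality loss triggers retraining before it reaches users.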
Many important situations can be modeled with nonlinear parabolic partial differential equations (PDEs) in several dimensions. In general, these PDEs are solved by discretizing in the spatial variables, transforming them into huge systems of ordinary differential equations (ODEs), which are very stiff. Standard explicit methods therefore require a large number of iterations, while implicit schemes are computationally very expensive for huge systems of nonlinear ODEs. Several families of Extrapolated Stabilized Explicit Runge-Kutta schemes (ESERK) with different orders of accuracy (3 to 6) are derived and analyzed in this work. They are explicit methods whose stability regions extend along the negative real semi-axis quadratically with the number of stages s; hence they can solve stiff problems much faster than traditional explicit schemes. Additionally, they allow the step length to be adapted easily and at very small cost.
Two new families of ESERK schemes (ESERK3 and ESERK6) are derived and analyzed in this work. Each family contains more than 50 new schemes, with up to 84,000 stages in the case of ESERK6. For the first time, we also parallelized all these variable-step-length, variable-number-of-stages algorithms (ESERK3, ESERK4, ESERK5, and ESERK6). These parallelization strategies decrease computation times significantly, as discussed and shown numerically for two problems. The new codes thus provide very good results compared to other well-known ODE solvers. Finally, a new strategy is proposed to increase the efficiency of these schemes, and the idea of combining ESERK families in one code is discussed, because stiff problems typically have different zones, and the optimal order of convergence differs according to these zones and the requested tolerance.
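The quadratic growth of the stability region can be illustrated with the classic first-order stabilized polynomial — the shifted Chebyshev polynomial R_s(z) = T_s(1 + z/s²), whose real stability interval is [-2s², 0]. This is not an ESERK polynomial, only the underlying principle the abstract refers to:

```python
import numpy as np

# Stabilized explicit schemes: the shifted Chebyshev polynomial
# R_s(z) = T_s(1 + z/s**2) stays bounded by 1 on [-2*s**2, 0], an interval
# that grows quadratically with the number of stages s.
def stability_polynomial(z, s):
    """Evaluate R_s(z) = T_s(1 + z / s**2)."""
    coeffs = np.zeros(s + 1)
    coeffs[s] = 1.0                       # select the Chebyshev polynomial T_s
    return np.polynomial.chebyshev.chebval(1.0 + z / s**2, coeffs)

def stable_on_real_interval(s):
    """Check |R_s(z)| <= 1 for z in [-2*s**2, 0]."""
    z = np.linspace(-2.0 * s**2, 0.0, 2001)
    return bool(np.all(np.abs(stability_polynomial(z, s)) <= 1.0 + 1e-9))

# Doubling the stages quadruples the stable interval along the negative axis.
results = {s: stable_on_real_interval(s) for s in (5, 10, 20)}
```

For a stiff semi-discretized PDE whose eigenvalues reach -2s² · h⁻¹-type magnitudes, this quadratic growth is exactly why many stages buy disproportionately large stable step sizes.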
The industrial revolution, especially in the Industry 4.0 (IR4.0) era, has driven the introduction of many state-of-the-art technologies.
The automotive industry, like many other key industries, has been greatly influenced as well. The rapid development of the automotive industry in Europe has created a wide industry gap between the European Union (EU) and developing countries, such as those in South East Asia (SEA). In view of this situation, FH JOANNEUM, Austria, together with European partners from FH Aachen, Germany, and Politecnico di Torino, Italy, has taken the initiative to close the gap using the Erasmus+ Capacity Building in Higher Education grant from the EU. A consortium was founded to engage in automotive technology transfer within the European framework to Malaysian, Indonesian, and Thai higher education institutions (HEI) as well as to automotive industries in the respective countries. This is to be achieved by establishing Engineering Knowledge Transfer Units (EKTU) in the respective SEA institutions, guided by the industry partners in their respective countries. These EKTUs can offer updated, innovative, and high-quality training courses to increase graduates' employability and strengthen relations between HEIs and the wider economic and social environment by addressing university-industry cooperation, which is the regional priority for Asia. It is expected that the capacity building initiative will improve the quality of higher education and enhance its relevance for the labor market and society in the SEA partner countries. The outcome of this project will greatly benefit the partners through a strong and complementary partnership targeting the automotive industry and enhanced larger-scale international cooperation between the European and SEA partners. It will also prepare the SEA HEIs for a sustainable partnership with the automotive industry in the region as a means of income generation in the future.
The Rothman–Woodroofe symmetry test statistic is revisited on the basis of independent but not necessarily identically distributed random variables. Distribution-freeness is obtained if the underlying distributions are all symmetric and continuous. The results are applied to testing symmetry in a meta-analysis random-effects model. The consistency of the procedure in this situation is discussed as well. A comparison with an alternative proposal from the literature is conducted via simulations. Real data are analyzed to demonstrate how the new approach works in practice.
The Atmospheric Remote-Sensing Infrared Exoplanet Large-survey, ARIEL, has been selected to be the next (M4) medium-class space mission in the ESA Cosmic Vision programme. From launch in 2028, and during the following 4 years of operation, ARIEL will perform precise spectroscopy of the atmospheres of ~1000 known transiting exoplanets using its metre-class telescope. A three-band photometer and three spectrometers cover the 0.5 µm to 7.8 µm region of the electromagnetic spectrum. This paper gives an overview of the mission payload, including the telescope assembly; the FGS (Fine Guidance System), which provides pointing information to the spacecraft as well as scientific photometry and low-resolution spectrometer data; the ARIEL InfraRed Spectrometer (AIRS); and other payload infrastructure such as the warm electronics, structures, and cryogenic cooling systems.
We discuss the problem of testing homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. A new resampling approach turns out to be appropriate for approximating the unknown null distribution. We prove that the resulting test asymptotically attains the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite-sample case. As an illustration, we apply the test to real data sets.
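The building block of the test can be sketched directly (one common normalization of the two-sample Cramér–von Mises distance; the paper's paired-sample setting with missing components and its resampling approximation are not reproduced here):

```python
import numpy as np

# Two-sample Cramér-von Mises distance between empirical distributions.
def cvm_distance(x, y):
    """n*m/(n+m)^2 times the sum over the pooled sample of (F_n - G_m)^2."""
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    n, m = len(x), len(y)
    pooled = np.concatenate([x, y])
    Fn = np.searchsorted(x, pooled, side="right") / n   # ECDF of x at pooled points
    Gm = np.searchsorted(y, pooled, side="right") / m   # ECDF of y at pooled points
    return n * m / (n + m) ** 2 * np.sum((Fn - Gm) ** 2)

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, 200)
same = cvm_distance(a, a)            # identical samples -> distance exactly 0
shifted = cvm_distance(a, a + 2.0)   # clearly different marginal distributions
```

Applied to the two coordinates of the remaining complete and incomplete observations, this distance is the statistic whose null distribution the resampling approach approximates.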
Domain experts regularly teach novice students how to perform a task. This often requires them to adjust their behavior to the less knowledgeable audience and, hence, to behave in a more didactic manner. Eye movement modeling examples (EMMEs) are a contemporary educational tool for displaying experts’ (natural or didactic) problem-solving behavior as well as their eye movements to learners. While research on expert-novice communication mainly focused on experts’ changes in explicit, verbal communication behavior, it is as yet unclear whether and how exactly experts adjust their nonverbal behavior. This study first investigated whether and how experts change their eye movements and mouse clicks (that are displayed in EMMEs) when they perform a task naturally versus teach a task didactically. Programming experts and novices initially debugged short computer codes in a natural manner. We first characterized experts’ natural problem-solving behavior by contrasting it with that of novices. Then, we explored the changes in experts’ behavior when being subsequently instructed to model their task solution didactically. Experts became more similar to novices on measures associated with experts’ automatized processes (i.e., shorter fixation durations, fewer transitions between code and output per click on the run button when behaving didactically). This adaptation might make it easier for novices to follow or imitate the expert behavior. In contrast, experts became less similar to novices for measures associated with more strategic behavior (i.e., code reading linearity, clicks on run button) when behaving didactically.
The management of scarce resources is an important aspect in the development of modern countries and of those on the threshold of becoming industrialised nations. The effects of mistaken resource management are not only purely economic but also social and socio-economic in nature. To present one aspect of these dependencies and influences, this paper uses a quantitative analysis to examine the interdependence and impact of resource rents on socio-economic development from 2002 to 2017. Nigeria and Norway have been chosen as reference countries due to their abundance of natural resources and similar economic performance, while their rankings in the Human Development Index differ dramatically. As the Human Development Index provides insight into a country's cultural and socio-economic characteristics and development in addition to economic indicators, it allows a comparison of the two countries. The hypothesis presented and discussed in this paper was researched before with a qualitative approach, in the author's master's thesis "The Human Development Index (HDI) as a Reflection of Resource Abundance (using Nigeria and Norway as a case study)" in 2018. From a holistic perspective, this paper finds that resource wealth which is not or poorly managed has in itself a negative impact on socio-economic development and significantly reduces the productivity of a state's citizens.
This is expressed, in particular for the years 2002 to 2017, in a negative correlation between GDP per capita and HDI value on the one hand and the share, or size, of resource rents in a country's GDP on the other.
Twee Kanten van één Medaille (2020)
Elastic transmission eigenvalues and their computation via the method of fundamental solutions (2020)
A stabilized version of the fundamental solution method, designed to capture ill-conditioning effects, is investigated with focus on the computation of complex-valued elastic interior transmission eigenvalues in two dimensions for homogeneous and isotropic media. The algorithm can be implemented very compactly and adapts to many similar eigenproblems based on partial differential equations, as long as the underlying fundamental solution can be generated easily. We develop a corroborative approximation analysis, which also implies new basic results for transmission eigenfunctions, and present numerical examples which demonstrate the feasibility of our eigenvalue recovery approach.
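The compactness claim can be illustrated on a much simpler model problem — a 2D Laplace Dirichlet problem on the unit disk rather than the elastic transmission eigenproblem — using the same recipe: fundamental solutions centred on a fictitious boundary, collocation on the physical boundary, one least-squares solve. All geometry parameters below are assumptions for the sketch.

```python
import numpy as np

# Method of fundamental solutions for the 2D Laplace equation on the unit disk.
n_src, n_col = 40, 80
src = 2.0 * np.exp(2j * np.pi * np.arange(n_src) / n_src)    # sources, radius 2
col = np.exp(2j * np.pi * (np.arange(n_col) + 0.5) / n_col)  # boundary points

def phi(z, s):
    """Fundamental solution of the 2D Laplacian: -log|z - s| / (2*pi)."""
    return -np.log(np.abs(z - s)) / (2.0 * np.pi)

u_exact = lambda z: z.real**2 - z.imag**2      # harmonic test function x^2 - y^2

A = phi(col[:, None], src[None, :])            # collocation matrix
coeffs, *_ = np.linalg.lstsq(A, u_exact(col), rcond=None)

z0 = 0.3 + 0.4j                                # interior evaluation point
u_mfs = phi(z0, src) @ coeffs
err = abs(u_mfs - u_exact(z0))                 # small inside the disk
```

For eigenvalue problems, as in the paper, the same collocation matrix is assembled as a function of the spectral parameter, and (approximate) eigenvalues are located where it becomes rank-deficient — which is where the stabilization against ill-conditioning becomes essential.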
Large-scale central receiver systems typically deploy between several thousand and more than a hundred thousand heliostats. During solar operation, each heliostat is aligned individually in such a way that its overall surface normal bisects the angle between the sun's position and the aim-point coordinate on the receiver. Due to various tracking error sources, achieving an accurate alignment of ≤1 mrad for all heliostats with respect to the aim points on the receiver without a calibration system can be regarded as unrealistic. A calibration system is therefore necessary not only to improve the aiming accuracy for achieving desired flux distributions but also to reduce or eliminate spillage. An overview of current larger-scale central receiver systems (CRS), tracking error sources, and the basic requirements of an ideal calibration system is presented. Leading up to the main topic, a description of general and specific terms on heliostat calibration and tracking control clarifies the terminology used in this work. Various figures illustrate the signal flows along typical components as well as the corresponding monitoring or measuring devices that indicate or measure along the signal (or effect) chain. The numerous calibration systems are described in detail and classified into groups. Two tables juxtapose the calibration methods for easier comparison. In an assessment, the advantages and disadvantages of the individual calibration methods are presented.
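The tracking geometry stated above reduces to a short computation — the heliostat surface normal is the normalized bisector of the unit vector towards the sun and the unit vector towards the aim point. The coordinates below are purely illustrative.

```python
import numpy as np

# Ideal heliostat orientation: normal bisects the sun and aim-point directions.
def heliostat_normal(heliostat_pos, sun_dir, aim_point):
    sun_u = sun_dir / np.linalg.norm(sun_dir)
    aim_u = aim_point - heliostat_pos
    aim_u = aim_u / np.linalg.norm(aim_u)
    n = sun_u + aim_u
    return n / np.linalg.norm(n)

pos = np.array([0.0, 0.0, 0.0])       # heliostat at the origin (hypothetical)
sun = np.array([0.0, -1.0, 1.0])      # direction towards the sun
aim = np.array([0.0, 100.0, 50.0])    # aim point on the receiver tower

n = heliostat_normal(pos, sun, aim)

# Law of reflection: both rays make the same angle with the surface normal.
cos_in = np.dot(sun / np.linalg.norm(sun), n)
cos_out = np.dot((aim - pos) / np.linalg.norm(aim - pos), n)
```

The tracking error sources discussed in the text (encoder offsets, pedestal tilt, mirror canting, and so on) all perturb this ideal normal, which is what a calibration system has to measure and correct to below the ≤1 mrad level.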
We propose a chance-constrained programming model from stochastic programming theory to analyze limit and shakedown loads of structures under random strength with a lognormal distribution. A dual chance-constrained programming algorithm is developed to calculate both the upper and lower bounds of the plastic collapse limit and of the shakedown limit simultaneously. The edge-based smoothed finite element method (ES-FEM) is used with three-node linear triangular elements.
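For a single constraint, the chance-constrained reduction for lognormal strength can be worked out in closed form (a generic textbook transformation, not the paper's dual algorithm): requiring P(R ≥ L) ≥ p for R ~ Lognormal(μ, σ) is equivalent to the deterministic bound L ≤ exp(μ + σ·Φ⁻¹(1 − p)). The parameter values below are assumptions for illustration.

```python
from math import exp, log
from statistics import NormalDist

# Deterministic equivalent of the chance constraint P(R >= L) >= p
# for a lognormal strength R with parameters mu and sigma.
mu, sigma, p = 1.0, 0.2, 0.95        # assumed strength parameters, reliability

phi_inv = NormalDist().inv_cdf(1.0 - p)
L_max = exp(mu + sigma * phi_inv)    # largest load satisfying the constraint

# Verification: the survival probability of R at L_max is exactly p.
prob = 1.0 - NormalDist().cdf((log(L_max) - mu) / sigma)
```

In the structural setting, this kind of deterministic equivalent is what turns the probabilistic limit and shakedown conditions into optimization problems that bounds can be computed for.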