Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics. They play a significant role in mechanical and civil engineering design. The concept of direct methods aims to determine the ultimate load-bearing capacity of structures beyond the elastic range. For practical problems, the direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, the problem of shakedown analysis is treated as a stochastic programming problem. This paper presents chance-constrained programming, an effective method of stochastic programming, to solve shakedown analysis problems under random conditions of strength. In our investigation, the loading is deterministic, while the strength is distributed as a normal or lognormal variable.
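Under the normality assumption, a chance constraint on the random strength can be replaced by a deterministic equivalent via the standard normal quantile. The following minimal Python sketch illustrates this reduction; the function name and the numerical values are illustrative placeholders, not taken from the paper:

```python
from scipy.stats import norm

def deterministic_strength_bound(mu_R, sigma_R, alpha):
    """Deterministic equivalent of the chance constraint
    P(stress <= R) >= alpha for normally distributed strength
    R ~ N(mu_R, sigma_R^2): the random bound R is replaced by
    its alpha-reliability margin."""
    z = norm.ppf(alpha)          # standard normal quantile
    return mu_R - z * sigma_R    # admissible stress bound

# Example: mean strength 240 MPa, std 12 MPa, 95 % reliability
bound = deterministic_strength_bound(240.0, 12.0, 0.95)
```

The same idea carries over to the lognormal case by applying the quantile transform to the logarithm of the strength.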
The development and operation of hybrid or purely electrically powered aircraft in regional air mobility is a significant challenge for the entire aviation sector. This technology is expected to lead to substantial advances in flight performance, energy efficiency, reliability, safety, noise reduction, and exhaust emissions. Nevertheless, any consumed energy results in heat or carbon dioxide emissions, and limited electric energy storage capabilities suppress commercial use. Therefore, the significant challenges to achieving eco-efficient aviation are increased aircraft efficiency, the development of new energy storage technologies, and the optimization of flight operations. Two major approaches to higher eco-efficiency are identified: the first is to take horizontal and vertical atmospheric motion phenomena into account, where atmospheric waves in particular hold exciting potential; the second is the use of the regeneration ability of electric aircraft. The fusion of both strategies is expected to improve efficiency. The objective is to reduce energy consumption during flight while not neglecting commercial usability and convenient flight characteristics. Therefore, an optimal control problem based on a general aviation class aircraft has to be developed and validated by flight experiments. The formulated approach enables the development of detailed knowledge of the potential and limitations of optimizing flight missions, considering the capability of regeneration and atmospheric influences to increase efficiency and range.
It was generally believed that coal sources are not favorable as live-in habitats for microorganisms due to their recalcitrant chemical nature and negligible decomposition. However, accumulating evidence has revealed the presence of diverse microbial groups in coal environments and their significant metabolic role in coal biogeochemical dynamics and ecosystem functioning. The high oxygen content, organic fractions, and lignin-like structures of lower-rank coals may provide effective means for microbial attack, still representing a greatly unexplored frontier in microbiology. Coal degradation/conversion technology by native bacterial and fungal species has great potential in agricultural development, chemical industry production, and environmental rehabilitation. Furthermore, native microalgal species can offer a sustainable energy source and an excellent bioremediation strategy applicable to coal spill/seam waters. Additionally, monitoring the fate of the microbial community would serve as an indicator of restoration progress on post-coal-mining sites. This review puts forward a comprehensive vision of coal biodegradation and bioprocessing by microorganisms native to coal environments for determining their biotechnological potential and possible applications.
Lolium perenne (perennial ryegrass) is a productive and high-quality forage grass indigenous to Southern Europe, temperate Asia, and North Africa. Nowadays it is widespread and the dominant grass species on green areas in temperate climates. This abundant source of biomass is suitable for the development of bioeconomic processes because of its high cellulose and water-soluble carbohydrate content. In this work, novel breeds of the perennial ryegrass are being examined with regard to their quality parameters and biotechnological utilization options within the context of bioeconomy. Three processing operations are presented. In the first process, the perennial ryegrass is pretreated by pressing or hydrothermal extraction to derive glucose via subsequent enzymatic hydrolysis of cellulose. A yield of up to 82 % glucose was achieved when using the hydrothermal extraction as pretreatment. In a second process, the ryegrass is used to produce lactic acid in high concentrations. The influence of the growth conditions and the cutting time on the carboxylic acid yield is investigated. A yield of lactic acid of above 150 g kg⁻¹ dry matter was achieved. The third process is to use Lolium perenne as a substrate in the fermentation of K. marxianus for the microbial production of single-cell proteins. The perennial ryegrass is screw-pressed and the press juice is used as medium. When supplementing the press juice with yeast media components, a biomass concentration of up to 16 g L⁻¹ could be achieved.
Cell spraying has become a feasible application method for cell therapy and tissue engineering approaches. Different devices have been used with varying success. Often, twin-fluid atomizers are used, which require a high gas velocity for optimal aerosolization characteristics. To decrease the amount and velocity of required air, a custom-made atomizer was designed based on the effervescent principle. Different designs were evaluated regarding spray characteristics and their influence on human adipose-derived mesenchymal stromal cells. The arithmetic mean diameters of the droplets were 15.4–33.5 µm with decreasing diameters for increasing gas-to-liquid ratios. The survival rate was >90% of the control for the lowest gas-to-liquid ratio. For higher ratios, cell survival decreased to approximately 50%. Further experiments were performed with the design, which had shown the highest survival rates. After seven days, no significant differences in metabolic activity were observed. The apoptosis rates were not influenced by aerosolization, while high gas-to-liquid ratios caused increased necrosis levels. Tri-lineage differentiation potential into adipocytes, chondrocytes, and osteoblasts was not negatively influenced by aerosolization. Thus, the effervescent aerosolization principle was proven suitable for cell applications requiring reduced amounts of supplied air. This is the first time an effervescent atomizer was used for cell processing.
Having well-defined control strategies for fuel cells that can efficiently detect errors and take corrective action is critically important for safety in all applications, and especially so in aviation. The algorithms not only ensure operator safety by monitoring the fuel cell and connected components, but also contribute to extending the health of the fuel cell, its durability, and its safe operation over its lifetime. While sensors provide peripheral data surrounding the fuel cell, the internal states of the fuel cell cannot be directly measured. To overcome this restriction, a Kalman filter has been implemented as an internal state observer.
Other safety conditions are evaluated using real-time data from every connected sensor, and corrective actions take place automatically to ensure safety. The algorithms discussed in this paper have been validated through Model-in-the-Loop (MiL) tests as well as practical validation at a dedicated test bench.
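To illustrate the observer idea, the following is a minimal scalar Kalman filter sketch. The state, the measurement model, and the noise covariances are hypothetical placeholders, not the paper's actual fuel-cell model:

```python
# Minimal linear Kalman filter for estimating a hypothetical scalar
# internal fuel-cell state from a noisy peripheral sensor reading.
# A, H, Q, R are illustrative assumptions, not values from the paper.
def kalman_step(x, P, z, A=1.0, H=1.0, Q=1e-4, R=1e-2):
    # Predict: propagate state estimate and its variance
    x_pred = A * x
    P_pred = A * P * A + Q
    # Update: blend prediction and measurement z via the Kalman gain
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Feed a short sequence of noisy readings around a true value of 1.0
x, P = 0.0, 1.0
for z in [0.9, 1.1, 1.0, 0.95]:
    x, P = kalman_step(x, P, z)
```

In a real controller the same predict/update cycle runs at the sensor sampling rate, with vector-valued states and matrices in place of the scalars used here.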
The Industrial Revolution 4.0 (IR4.0) era has driven the introduction of many state-of-the-art technologies, especially in the automotive industry. The rapid development of the automotive industry in Europe has created a wide industry gap between the European Union (EU) and developing countries such as those in South-East Asia (SEA). In this situation, FH Joanneum, Austria, together with European partners from FH Aachen, Germany, and Politecnico di Torino, Italy, is taking the initiative to close the gap using the Erasmus+ United grant from the EU. A consortium was founded to engage in automotive technology transfer using the European framework to Malaysian, Indonesian, and Thai Higher Education Institutions (HEI) as well as automotive industries. This is to be achieved by establishing Engineering Knowledge Transfer Units (EKTU) in the respective SEA institutions, guided by the industry partners in their respective countries. These EKTU could offer updated, innovative, and high-quality training courses to increase graduates' employability in higher education institutions and strengthen relations between HEI and the wider economic and social environment by addressing university-industry cooperation, which is the regional priority for Asia. It is expected that the Capacity Building Initiative will improve the quality of higher education and enhance its relevance for the labor market and society in the SEA partner countries. The outcome of this project would greatly benefit the partners through a strong and complementary partnership targeting the automotive industry and enhanced larger-scale international cooperation between the European and SEA partners. It would also prepare the SEA HEI for a sustainable partnership with the automotive industry in the region as a means of income generation in the future.
When confining pressure is low or absent, extensional fractures are typical, with fractures occurring on unloaded planes in rock. These "paradox" fractures can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock was developed, but it makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting failure modes which are unexpected in the usual understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD), and as a failure surface at peak strength (P). Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass than the simple extension strain criterion.
In motor vehicle trading, questions of good-faith acquisition are among the near-daily standard problems. With its decision of 23 September 2022 – V ZR 148/21, MDR 2022, 1541, the BGH adds another piece to this broadly faceted topic: the purchaser received the sold vehicle without handover of the registration certificate Part II (Zulassungsbescheinigung Teil II), but claims that this certificate was presented (as a forgery) at the time of purchase to the intermediary he had engaged. In fact, the original remained throughout with the true owner, who now demands return of the vehicle. In this constellation, the BGH ultimately protects the purchaser. The decision is remarkable in several respects.
Given the proven impact of statistical fracture analysis on fracture classifications, it is desirable to minimize the manual work and to maximize the repeatability of this approach. We address this with an algorithm that reduces the manual effort to segmentation, fragment identification, and reduction. Fracture edge detection and heat map generation are performed automatically. Given the same input, the algorithm always delivers the same output. The tool transforms one intact template consecutively onto each fractured specimen by linear least-squares optimization, detects the fragment edges in the template, and then superimposes them to generate a fracture probability heat map.
We hypothesized that the algorithm runs faster than the manual evaluation and with low (<5 mm) deviation. We tested the hypothesis in 10 fractured proximal humeri and found that it performs with good accuracy (2.5 mm ± 2.4 mm averaged Euclidean distance) and speed (23 times faster). When applied to a distal humerus, a tibial plateau, and a scaphoid fracture, the run times were low (1–2 min), and the detected edges were judged correct by visual inspection. In the geometrically complex acetabulum, with a run time of 78 min, some outliers were considered acceptable. An automatically generated fracture probability heat map based on 50 proximal humerus fractures matches the areas of high fracture risk reported in the medical literature.
Such automation of the fracture analysis method is advantageous and could be extended to reduce the manual effort even further.
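The linear least-squares transformation step can be sketched as follows; the 2-D landmark coordinates and the affine model are illustrative assumptions (the actual tool registers 3-D bone geometries):

```python
import numpy as np

# Fit an affine transform mapping template landmarks onto a fractured
# specimen by linear least squares (2-D toy example; the pipeline
# described above works on 3-D bone surfaces).
template = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)
specimen = np.array([[0.1, 0.0], [1.1, 0.1], [0.0, 1.0], [1.0, 1.1]], float)

# Homogeneous coordinates: solve A @ X ≈ specimen for the 3x2 matrix X
A = np.hstack([template, np.ones((len(template), 1))])
X, *_ = np.linalg.lstsq(A, specimen, rcond=None)

# Registration residual of the fitted transform
mapped = A @ X
rmse = float(np.sqrt(np.mean(np.sum((mapped - specimen) ** 2, axis=1))))
```

Once the template is registered onto each specimen, the detected fragment edges can be accumulated in template coordinates to form the probability heat map.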
This work introduces a novel method for the detection of H₂O₂ vapor/aerosol at low concentrations, which is mainly applied in the sterilization of equipment in the medical industry. Interdigitated electrode (IDE) structures have been fabricated by means of microfabrication techniques. A differential setup of IDEs was prepared, containing an active sensor element (active IDE) and a passive sensor element (passive IDE), where the former was immobilized with an enzymatic membrane of horseradish peroxidase that is selective towards H₂O₂. Changes in the IDEs' capacitance values (active sensor element versus passive sensor element) under an H₂O₂ vapor/aerosol atmosphere demonstrated detection in the concentration range up to 630 ppm with a fast response time (<60 s). The influence of relative humidity on the sensor signal was also tested, showing no cross-sensitivity. The repeatability assessment of the IDE biosensors confirmed their stable capacitive signal in eight subsequent cycles of exposure to H₂O₂ vapor/aerosol. Room-temperature detection of H₂O₂ vapor/aerosol with such miniaturized biosensors will allow a future three-dimensional, flexible mapping of aseptic chambers and help to evaluate sterilization assurance in the medical industry.
Inference on the basis of high-dimensional data and inference on the basis of functional data are two topics discussed frequently in the current statistical literature. A possibility to include both topics in a single approach is to work on a very general space for the underlying observations, such as a separable Hilbert space. We propose a general method for consistent hypothesis testing on the basis of random variables with values in separable Hilbert spaces. We avoid concerns with the curse of dimensionality due to a projection idea: we apply well-known test statistics from nonparametric inference to the projected data and integrate over all projections from a specific set with respect to suitable probability measures. In contrast to classical methods, which are applicable to real-valued random variables or random vectors of dimension lower than the sample size, the tests can be applied to random vectors of dimension larger than the sample size or even to functional and high-dimensional data. In general, resampling procedures such as bootstrap or permutation are suitable to determine critical values. The idea can be extended to the case of incomplete observations. Moreover, we develop an efficient algorithm for implementing the method. Examples are given for testing goodness-of-fit in a one-sample situation in [1] and for testing marginal homogeneity on the basis of a paired sample in [2]. Here, the test statistics in use can be seen as generalizations of the well-known Cramér–von Mises test statistics in the one-sample and two-sample cases. The treatment of other testing problems is possible as well. By using the theory of U-statistics, for instance, asymptotic null distributions of the test statistics are obtained as the sample size tends to infinity. Standard continuity assumptions ensure the asymptotic exactness of the tests under the null hypothesis and that the tests detect any alternative in the limit.
Simulation studies demonstrate the size and power of the tests in the finite-sample case, confirm the theoretical findings, and are used for comparison with competing procedures. A possible application of the general approach is inference for stock market returns, also at high data frequencies. In the field of empirical finance, statistical inference for stock market prices usually takes place on the basis of the related log-returns as data. In the classical models for stock prices, i.e., the exponential Lévy model, Black–Scholes model, and Merton model, properties such as independence and stationarity of the increments ensure an independent and identically distributed structure of the data. Specific trends during certain periods of the stock price processes can cause complications in this regard. In fact, our approach can compensate for those effects by treating the log-returns as random vectors or even as functional data.
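The projection idea can be sketched for a two-sample comparison as follows. This is a simplified illustration: random Gaussian directions replace the integration over a projection set, and a Kolmogorov–Smirnov statistic stands in for the Cramér–von Mises statistics discussed above:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

def projected_test_stat(X, Y, n_dirs=100):
    """Average a univariate two-sample statistic over random
    projection directions; the dimension may exceed the sample size."""
    d = X.shape[1]
    stats = []
    for _ in range(n_dirs):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)                      # unit direction
        stats.append(ks_2samp(X @ u, Y @ u).statistic)
    return float(np.mean(stats))

# High-dimensional toy data: dimension d = 200 > sample size n = 50
X = rng.standard_normal((50, 200))
Y = rng.standard_normal((50, 200)) + 2.0            # mean-shifted sample
t_null = projected_test_stat(X, rng.standard_normal((50, 200)))
t_alt = projected_test_stat(X, Y)
```

In practice, critical values for such a statistic would be obtained by permutation or bootstrap resampling, as noted above.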
06| Why it works better together
10| Interview
14| Who's the boss here?
18| The interface between humans and technology
22| Jülich, a second home
28| Between fear and hope
32| A great moment for FH Aachen
36| Against all resistance
38| A place that stays
42| The blossoming one
46| The computer takes the wheel
52| That warms the heart
54| Aiming high
58| Counselling services
60| The old gem
This thesis aims at the presentation and discussion of well-accepted and new
imaging techniques applied to different types of flow in common hydraulic
engineering environments. All studies are conducted in laboratory conditions and
focus on flow depth and velocity measurements. Investigated flows cover a wide
range of complexity, e.g. propagation of waves, dam-break flows, slightly and fully
aerated spillway flows as well as highly turbulent hydraulic jumps.
New imaging methods are compared to different types of sensors which are frequently
employed in contemporary laboratory studies. This classical instrumentation as well
as the general concept of hydraulic modeling is introduced to give an overview on
experimental methods.
Flow depths are commonly measured by means of ultrasonic sensors, also known as
acoustic displacement sensors. These sensors may provide accurate data with high
sample rates in case of simple flow conditions, e.g. low-turbulent clear water flows.
However, with increasing turbulence, higher uncertainty must be considered.
Moreover, ultrasonic sensors can provide point data only, while the relatively large
acoustic beam footprint may lead to another source of uncertainty in case of
relatively short, highly turbulent surface fluctuations (ripples) or free-surface
air-water flows. Analysis of turbulent length and time scales of surface fluctuations
from point measurements is also difficult. Imaging techniques with different
dimensionality, however, may close this gap. It is shown in this thesis that edge
detection methods (known from computer vision) may be used for two-dimensional
free-surface extraction (i.e. from images taken through transparent sidewalls in
laboratory flumes). Another opportunity in hydraulic laboratory studies comes with
the application of stereo vision. Low-cost RGB-D sensors can be used to gather
instantaneous, three-dimensional free-surface elevations, even in flows with very
high complexity (e.g. aerated hydraulic jumps). It will be shown that the uncertainty
of these methods is of similar order as for classical instruments.
Particle Image Velocimetry (PIV) is a well-accepted and widespread imaging
technique for velocity determination in laboratory conditions. In combination with
high-speed cameras, PIV can give time-resolved velocity fields in 2D/3D or even as
volumetric flow fields. PIV is based on a cross-correlation technique applied to small
subimages of seeded flows. The minimum size of these subimages defines the
maximum spatial resolution of resulting velocity fields. A derivative of PIV for
aerated flows is also available, i.e. the so-called Bubble Image Velocimetry (BIV). This
thesis emphasizes the capacities and limitations of both methods, using relatively
simple setups with halogen and LED illuminations. It will be demonstrated that
PIV/BIV images may also be processed by means of Optical Flow (OF) techniques.
OF is another method originating from the computer vision discipline, based on the
assumption of image brightness conservation within a sequence of images. The
Horn-Schunck approach, which has been first employed to hydraulic engineering
problems in the studies presented herein, yields dense velocity fields, i.e. pixelwise
velocity data. As discussed hereinafter, the accuracy of OF competes well with PIV
for clear-water flows and even improves results (compared to BIV) for aerated flow
conditions. In order to independently benchmark the OF approach, synthetic images
with defined turbulence intensity are used.
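The cross-correlation step at the core of PIV can be sketched as follows. The synthetic interrogation window and the periodic shift are illustrative; real PIV processing adds windowing, sub-pixel peak fitting, and outlier validation:

```python
import numpy as np

# Core PIV step: find the displacement between two interrogation
# windows as the peak of their cross-correlation (computed via FFT).
rng = np.random.default_rng(1)
win = rng.random((32, 32))                  # synthetic seeded pattern
dy, dx = 3, -2                              # imposed particle shift
shifted = np.roll(np.roll(win, dy, axis=0), dx, axis=1)

# Cross-correlation theorem: correlation = ifft(conj(F) * G)
corr = np.fft.ifft2(np.fft.fft2(win).conj() * np.fft.fft2(shifted)).real
peak = np.unravel_index(np.argmax(corr), corr.shape)

# Map the peak index back to a signed displacement (periodic wrap-around)
found_dy = peak[0] if peak[0] <= 16 else peak[0] - 32
found_dx = peak[1] if peak[1] <= 16 else peak[1] - 32
```

Dividing the recovered displacement by the time between frames and the image scale yields the local velocity vector for that subimage.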
Computer vision offers new opportunities that may help to improve the
understanding of fluid mechanics and fluid-structure interactions in laboratory
investigations. In prototype environments, it can be employed for obstacle detection
(e.g. identification of potential fish migration corridors) and recognition (e.g. fish
species for monitoring in a fishway) or surface reconstruction (e.g. inspection of
hydraulic structures). It can thus be expected that applications to hydraulic
engineering problems will develop rapidly in near future. Current methods have not
been developed for fluids in motion. Systematic future developments are needed to
improve the results in such difficult conditions.
High aerodynamic efficiency requires propellers with high aspect ratios, while propeller sweep potentially reduces noise. Propeller sweep and high aspect ratios increase elasticity and coupling of structural mechanics and aerodynamics, affecting the propeller performance and noise. Therefore, this paper analyzes the influence of elasticity on forward-swept, backward-swept, and unswept propellers in hover conditions. A reduced-order blade element momentum approach is coupled with a one-dimensional Timoshenko beam theory and Farassat's formulation 1A. The results of the aeroelastic simulation are used as input for the aeroacoustic calculation. The analysis shows that elasticity influences noise radiation because thickness and loading noise respond differently to deformations. In the case of the backward-swept propeller, the location of the maximum sound pressure level shifts forward by 0.5°, while in the case of the forward-swept propeller, it shifts backward by 0.5°. Therefore, aeroacoustic optimization requires the consideration of propeller deformation.
Even the shortest flight through unknown, cluttered environments requires reliable local path planning algorithms to avoid unforeseen obstacles. The algorithm must evaluate alternative flight paths and identify the best path if an obstacle blocks its way. Commonly, weighted sums are used here. This work shows that weighted Chebyshev distances and factorial achievement scalarising functions are suitable alternatives to weighted sums if combined with the 3DVFH* local path planning algorithm. Both methods considerably reduce the failure probability of simulated flights in various environments. The standard 3DVFH* uses a weighted sum and has a failure probability of 50% in the test environments. A factorial achievement scalarising function, which minimises the worst combination of two out of four objective functions, reaches a failure probability of 26%, while a weighted Chebyshev distance, which optimises the worst objective, has a failure probability of 30%. These results show promise for further enhancements and broader applicability.
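The difference between the two scalarisations can be illustrated with a toy example. The weights and objective values below are hypothetical, not those of the 3DVFH* implementation:

```python
# Compare a weighted sum with a weighted Chebyshev scalarisation for
# ranking candidate paths by multiple cost objectives (lower is better).
def weighted_sum(costs, w):
    return sum(wi * ci for wi, ci in zip(w, costs))

def weighted_chebyshev(costs, w):
    # Penalises only the single worst weighted objective
    return max(wi * ci for wi, ci in zip(w, costs))

w = [0.25, 0.25, 0.25, 0.25]       # equal objective weights
path_a = [0.2, 0.2, 0.2, 0.2]      # balanced candidate
path_b = [0.0, 0.0, 0.7, 0.0]      # cheap overall, one bad objective

best_by_sum = min((path_a, path_b), key=lambda c: weighted_sum(c, w))
best_by_cheb = min((path_a, path_b), key=lambda c: weighted_chebyshev(c, w))
```

Here the weighted sum prefers path_b because its low total masks the single badly violated objective, while the Chebyshev scalarisation prefers the balanced path_a, which is exactly the behaviour that helps avoid paths with one critical flaw.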
Ambitious climate targets affect the competitiveness of industries in the international market. To prevent such industries from moving to other countries in the wake of increased climate protection efforts, cost adjustments may become necessary. Their design requires knowledge of country-specific production costs. Here, we present country-specific cost figures for different production routes of steel, paying particular attention to transportation costs. The data can be used in floor price models aiming to assess the competitiveness of different steel production routes in different countries (Rübbelke, 2022).