A German–Brazilian research project investigates sugarcane as an energy crop for anaerobic digestion and biogas production. The aim of the project is a continuous, efficient, and stable biogas process with sugarcane as the substrate. Tests are carried out in a fermenter with a volume of 10 L.
In order to optimize the space–time load and achieve a stable process, a continuous process at laboratory scale has been devised. The daily feed quantity and the harvest time of the sugarcane substrate have been varied. Analyses of the digester content were conducted twice per week to monitor the process: the ratio of volatile organic acids to total inorganic carbon (VFA/TAC), the concentration of short-chain fatty acids, the organic dry matter, the pH value, and the total nitrogen, phosphate, and ammonium concentrations were monitored. In addition, the gas quality (the percentages of CO₂, CH₄, and H₂) and the quantity of the produced gas were analyzed.
The investigations demonstrate that feasible and economical production of biogas in a continuous process with energy cane as substrate is possible. With a daily feeding rate of 1.68 g_VS/(L·d), the average specific gas formation rate was 0.5 m³/kg_VS. The long-term study demonstrates a surprisingly fast metabolism of short-chain fatty acids. This indicates a stable process that is less susceptible to disturbance than with other substrates.
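As a worked example (illustrative only, using the figures reported above), the daily feeding rate and the specific gas formation rate can be combined to estimate the expected daily gas volume of the 10 L lab fermenter:

```python
# Illustrative back-of-the-envelope check based on the reported figures.
FERMENTER_VOLUME_L = 10.0   # fermenter working volume [L]
LOADING_RATE = 1.68         # daily feeding rate [g_VS / (L*d)]
SPECIFIC_GAS_RATE = 0.5     # specific gas formation rate [m^3 / kg_VS]

daily_vs_fed_g = LOADING_RATE * FERMENTER_VOLUME_L   # g_VS fed per day
# 0.5 m^3 per kg_VS equals 0.5 L of gas per g of volatile solids:
daily_gas_l = daily_vs_fed_g * SPECIFIC_GAS_RATE
print(f"VS fed per day: {daily_vs_fed_g:.1f} g, expected gas: {daily_gas_l:.1f} L/d")
# → 16.8 g_VS/d and 8.4 L of biogas per day
```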
Extracellular acidification is a basic indicator for alterations in two vital metabolic pathways: glycolysis and cellular respiration. Measuring these alterations by monitoring extracellular acidification with cell-based biosensors such as the light-addressable potentiometric sensor (LAPS) plays an important role in studying these pathways, whose disorders are associated with numerous diseases, including cancer. However, the surface of the biosensors must be specially tailored to ensure high cell compatibility so that cells exhibit more in vivo-like behavior, which is critical for gaining more realistic in vitro results from the analyses, e.g., drug discovery experiments. In this work, O2 plasma patterning of the LAPS surface is studied to enhance surface features of the sensor chip, e.g., wettability and biofunctionality. The surface treated with O2 plasma for 30 s exhibits enhanced cytocompatibility for adherent CHO–K1 cells, which promotes cell spreading and proliferation. The plasma-modified LAPS chip is then integrated into a microfluidic system, which provides two identical channels to facilitate differential measurements of the extracellular acidification of CHO–K1 cells. To the best of our knowledge, this is the first time that extracellular acidification within microfluidic channels has been quantitatively visualized as differential (bio-)chemical images.
In collaborative research projects, researchers and practitioners work together to solve business-critical challenges. These projects often deal with ETL processes in which humans extract information from non-machine-readable documents by hand. Machine learning models can help to automate this task.
Since machine learning approaches are not deterministic, their output quality may degrade over time. This leads to an overall quality loss of the application that embeds the machine learning models. Hence, software quality in development and in production may differ.
Machine learning models are black boxes. That makes practitioners skeptical and raises the barrier to early productive use of research prototypes. Continuous monitoring of software quality in production enables an early response to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that possible new inputs are integrated into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with automated software quality monitoring in productive use and the possibility to auto-generate new test data from documents processed in production.
Through automated monitoring of software quality and auto-generated test data, this approach ensures that the software quality meets and maintains the requested thresholds in productive use, even during further continuous deployment and with changing input data.
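The monitoring idea described above can be sketched minimally as follows. All names are hypothetical, not from the paper's reference implementation: a per-document quality score is tracked against a requested threshold, and documents that fall below it are queued as candidates for auto-generated test data.

```python
# Minimal sketch (hypothetical names) of threshold-based quality monitoring
# in production with collection of new test-data candidates.
from dataclasses import dataclass, field

@dataclass
class QualityMonitor:
    threshold: float                        # requested quality threshold
    scores: list = field(default_factory=list)
    new_test_data: list = field(default_factory=list)

    def record(self, document_id: str, score: float) -> bool:
        """Record a per-document quality score; return True if quality holds."""
        self.scores.append(score)
        if score < self.threshold:
            # Below threshold: keep the document as auto-generated test data.
            self.new_test_data.append(document_id)
            return False
        return True

monitor = QualityMonitor(threshold=0.9)
monitor.record("doc-1", 0.95)
monitor.record("doc-2", 0.80)   # below threshold: collected as test data
```

In a real deployment the scores would come from the embedded model's evaluation pipeline, and the collected documents would feed back into model training.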
Many important situations can be modeled with nonlinear parabolic partial differential equations (PDEs) in several dimensions. In general, these PDEs can be solved by discretizing in the spatial variables, transforming them into huge systems of ordinary differential equations (ODEs), which are very stiff. Standard explicit methods therefore require a large number of iterations to solve stiff problems, while implicit schemes are computationally very expensive for huge systems of nonlinear ODEs. Several families of Extrapolated Stabilized Explicit Runge–Kutta schemes (ESERK) with different orders of accuracy (3 to 6) are derived and analyzed in this work. They are explicit methods whose stability regions extend along the negative real semi-axis quadratically with the number of stages s; hence they can solve stiff problems much faster than traditional explicit schemes. Additionally, they allow easy adaptation of the step length at very small cost.
Two new families of ESERK schemes (ESERK3 and ESERK6) are derived and analyzed in this work. Each family has more than 50 new schemes, with up to 84,000 stages in the case of ESERK6. For the first time, we also parallelized all these variable-step-length and variable-number-of-stages algorithms (ESERK3, ESERK4, ESERK5, and ESERK6). These parallelized strategies decrease computation times significantly, as discussed and shown numerically for two problems. Thus, the new codes provide very good results compared to other well-known ODE solvers. Finally, a new strategy is proposed to increase the efficiency of these schemes, and the idea of combining ESERK families in one code is discussed, because stiff problems typically have different zones, and depending on them and on the requested tolerance the optimal order of convergence differs.
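The quadratic growth of the stability interval with the stage number s can be illustrated with the classical first-order Chebyshev stability polynomial R_s(z) = T_s(1 + z/s²), which satisfies |R_s(z)| ≤ 1 on [-2s², 0]. This is a sketch of the shared principle behind stabilized explicit methods, not the authors' ESERK code:

```python
# Illustrative sketch: real stability interval of Chebyshev-type stabilized
# explicit methods grows quadratically with the number of stages s.
from math import cos, acos, cosh, acosh

def cheb_T(s: int, x: float) -> float:
    """Chebyshev polynomial of the first kind, T_s(x), for any real x."""
    if -1.0 <= x <= 1.0:
        return cos(s * acos(x))
    return cosh(s * acosh(x)) if x > 1.0 else (-1) ** s * cosh(s * acosh(-x))

def stable(s: int, z: float) -> bool:
    """Is real z inside the stability interval of the s-stage scheme?"""
    return abs(cheb_T(s, 1.0 + z / s ** 2)) <= 1.0 + 1e-12

# The interval is about [-2 s^2, 0]: doubling s quadruples its length.
assert stable(10, -200.0) and not stable(10, -201.0)
assert stable(20, -800.0) and not stable(20, -801.0)
```

ESERK combines such stabilized stages with extrapolation to reach orders 3 to 6 while keeping this quadratic stability growth.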
The industrial revolution, especially in the Industry 4.0 (IR4.0) era, has driven the introduction of many state-of-the-art technologies.
The automotive industry, like many other key industries, has been greatly influenced. The rapid development of the automotive industry in Europe has created a wide industry gap between the European Union (EU) and developing countries such as those in South East Asia (SEA). In response to this situation, FH JOANNEUM, Austria, together with the European partners FH Aachen, Germany, and Politecnico di Torino, Italy, has taken the initiative to close the gap, utilizing the Erasmus+ Capacity Building in Higher Education grant from the EU. A consortium was founded to engage in automotive technology transfer, using the European framework, to Malaysian, Indonesian, and Thai Higher Education Institutions (HEI) as well as to the automotive industries in the respective countries. This is to be achieved by establishing Engineering Knowledge Transfer Units (EKTU) in the respective SEA institutions, guided by the industry partners in their respective countries. These EKTUs can offer updated, innovative, and high-quality training courses to increase graduates' employability at higher education institutions and strengthen relations between HEIs and the wider economic and social environment by addressing university–industry cooperation, which is the regional priority for Asia. It is expected that the Capacity Building initiative will improve the quality of higher education and enhance its relevance for the labor market and society in the SEA partner countries. The outcome of this project would greatly benefit the partners through a strong and complementary partnership targeting the automotive industry and enhanced larger-scale international cooperation between the European and SEA partners. It would also prepare the SEA HEIs for a sustainable partnership with the automotive industry in the region as a means of income generation in the future.
The Rothman–Woodroofe symmetry test statistic is revisited on the basis of independent but not necessarily identically distributed random variables. Distribution-freeness is obtained if the underlying distributions are all symmetric and continuous. The results are applied to testing symmetry in a meta-analysis random effects model. The consistency of the procedure in this situation is discussed as well. A comparison with an alternative proposal from the literature is conducted via simulations. Real data are analyzed to demonstrate how the new approach works in practice.
The Atmospheric Remote-Sensing Infrared Exoplanet Large-survey, ARIEL, has been selected as the next (M4) medium-class space mission in the ESA Cosmic Vision programme. From launch in 2028, and during the following four years of operation, ARIEL will perform precise spectroscopy of the atmospheres of ~1000 known transiting exoplanets using its metre-class telescope. A three-band photometer and three spectrometers cover the 0.5 µm to 7.8 µm region of the electromagnetic spectrum. This paper gives an overview of the mission payload, including the telescope assembly; the Fine Guidance System (FGS), which provides both pointing information to the spacecraft and scientific photometry and low-resolution spectrometer data; the ARIEL InfraRed Spectrometer (AIRS); and other payload infrastructure such as the warm electronics, structures, and cryogenic cooling systems.
We discuss the problem of testing homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. It is seen that a new resampling approach is appropriate for the approximation of the unknown null distribution. We prove that the resulting test asymptotically reaches the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite sample case. As an illustration, we apply the test to real data sets.
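For readers unfamiliar with the underlying distance, the classical two-sample Cramér–von Mises statistic can be computed directly from empirical distribution functions. This is a hedged sketch of the standard statistic only; the paper's construction for paired samples with missing components is more involved:

```python
# Classical two-sample Cramér–von Mises distance,
# T = nm/N^2 * sum over pooled points of (F_n - G_m)^2.
def ecdf(sample, t):
    """Empirical CDF of `sample` evaluated at t."""
    return sum(1 for v in sample if v <= t) / len(sample)

def cvm_two_sample(x, y):
    n, m = len(x), len(y)
    pooled = x + y
    return n * m / (n + m) ** 2 * sum(
        (ecdf(x, z) - ecdf(y, z)) ** 2 for z in pooled)

# Identical samples give distance 0; well-separated samples a large one.
assert cvm_two_sample([1, 2, 3], [1, 2, 3]) == 0.0
assert cvm_two_sample([1, 2, 3], [10, 11, 12]) > 0.4
```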
Domain experts regularly teach novice students how to perform a task. This often requires them to adjust their behavior to the less knowledgeable audience and, hence, to behave in a more didactic manner. Eye movement modeling examples (EMMEs) are a contemporary educational tool for displaying experts’ (natural or didactic) problem-solving behavior as well as their eye movements to learners. While research on expert-novice communication mainly focused on experts’ changes in explicit, verbal communication behavior, it is as yet unclear whether and how exactly experts adjust their nonverbal behavior. This study first investigated whether and how experts change their eye movements and mouse clicks (that are displayed in EMMEs) when they perform a task naturally versus teach a task didactically. Programming experts and novices initially debugged short computer codes in a natural manner. We first characterized experts’ natural problem-solving behavior by contrasting it with that of novices. Then, we explored the changes in experts’ behavior when being subsequently instructed to model their task solution didactically. Experts became more similar to novices on measures associated with experts’ automatized processes (i.e., shorter fixation durations, fewer transitions between code and output per click on the run button when behaving didactically). This adaptation might make it easier for novices to follow or imitate the expert behavior. In contrast, experts became less similar to novices for measures associated with more strategic behavior (i.e., code reading linearity, clicks on run button) when behaving didactically.
Elastic transmission eigenvalues and their computation via the method of fundamental solutions
(2020)
A stabilized version of the method of fundamental solutions that catches ill-conditioning effects is investigated, with a focus on the computation of complex-valued elastic interior transmission eigenvalues in two dimensions for homogeneous and isotropic media. Its algorithm can be implemented very compactly and adapts to many similar eigenproblems based on partial differential equations, as long as the underlying fundamental solution can be easily generated. We develop a corroborative approximation analysis, which also yields new basic results for transmission eigenfunctions, and present numerical examples which together demonstrate the feasibility of our eigenvalue recovery approach.
In this paper we present the SMART-FACTORY, a setup for a research and teaching facility in industrial robotics that is based on the RoboCup Logistics League. It is driven by the need to develop and apply solutions for digital production. Digitization receives constantly increasing attention in many areas, especially in industry. The common theme is to make things smart by using intelligent computer technology. Especially in the last decade there have been many attempts to improve existing processes in factories, for example in production logistics, also by deploying cyber-physical systems. An initiative that explores challenges and opportunities for robots in such a setting is the RoboCup Logistics League. Since its foundation in 2012, it has been an international effort for research and education in an intra-warehouse logistics scenario. During seven years of competition, much knowledge and experience regarding autonomous robots has been gained. This knowledge and experience shall provide the basis for further research into the challenges of future production. The focus of our SMART-FACTORY is to create a stimulating environment for research on logistics robotics, for teaching activities in computer science and electrical engineering programmes, and for industrial users to study and explore the feasibility of future technologies. Building on a very successful history in the RoboCup Logistics League, we aim to provide stakeholders with a dedicated facility oriented at their individual needs.
Innovative breeds of sugarcane yield up to 2.5 times as much organic matter as conventional breeds, resulting in a great potential for biogas production. The use of biogas production as a complementary solution to conventional and second-generation ethanol production in Brazil may increase the energy produced per hectare in the sugarcane sector. Herein, it was demonstrated that through ensiling, energy cane can be conserved for six months; the stored cane can then be fed into a continuous biogas process. This approach is necessary to achieve year-round biogas production at an industrial scale. Batch tests revealed specific biogas potentials between 400 and 600 LN/kgVS for both the ensiled and non-ensiled energy cane, and the specific biogas potential of a continuous biogas process fed with ensiled energy cane was in the same range. Peak biogas losses through ensiling of up to 27% after six months were observed. Furthermore, compared with second-generation ethanol production using energy cane, the results indicated that biogas production from energy cane may lead to higher energy yields per hectare, with an average energy yield of up to 162 MWh/ha. Finally, the Farm²CBG concept is introduced, presenting an approach for decentralized biogas production.
In this article, a concept of implicit methods for scalar conservation laws in one or more spatial dimensions, which also allows for source terms of various types, is presented. This material significantly extends previous work of the first author (Breuß, SIAM J. Numer. Anal. 43(3), 970–986, 2005). Implicit notions are developed that are centered around a monotonicity criterion. We demonstrate a connection between a numerical scheme and a discrete entropy inequality, based on a classical approach by Crandall and Majda. Additionally, three implicit methods are investigated using the developed notions. Next, we give a convergence proof which is not based on a classical compactness argument. Finally, the theoretical results are confirmed by various numerical tests.
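To make the appeal of implicit notions concrete, here is an illustrative sketch (not one of the paper's schemes): the implicit first-order upwind method for the linear advection equation u_t + a u_x = 0 with a > 0. With ν = a·Δt/Δx, the update (1+ν)·u_i^{n+1} − ν·u_{i−1}^{n+1} = u_i^n is a bidiagonal system solvable by forward substitution, and the scheme is monotone and stable for any Δt:

```python
# One step of the implicit upwind scheme for u_t + a u_x = 0 (a > 0),
# solved exactly by forward substitution over the bidiagonal system.
def implicit_upwind_step(u, nu, inflow=0.0):
    """Advance one time step; `inflow` is the fixed left boundary value."""
    new = [0.0] * len(u)
    prev = inflow                        # u_{-1}^{n+1}: boundary value
    for i, ui in enumerate(u):
        new[i] = (ui + nu * prev) / (1.0 + nu)
        prev = new[i]
    return new

u = [1.0, 0.0, 0.0, 0.0]
u1 = implicit_upwind_step(u, nu=2.0)     # Courant number 2: still stable
assert all(0.0 <= v <= 1.0 for v in u1)  # monotone: no over/undershoots
assert max(u1) <= max(u)                 # discrete maximum principle
```

The monotonicity visible here (no new extrema are created even above the explicit CFL limit) is the kind of property the article's monotonicity criterion formalizes for nonlinear conservation laws.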
The established Hoeffding–Blum–Kiefer–Rosenblatt independence test statistic is investigated for partly non-identically distributed data. Surprisingly, it turns out that the statistic has the well-known distribution-free limiting null distribution of the classical criterion under standard regularity conditions. An application is testing goodness-of-fit for the regression function in a nonparametric random effects meta-regression model, where consistency is obtained as well. Simulations investigate the size and power of the approach for small and moderate sample sizes. A real data example based on clinical trials illustrates how the test can be used in applications.
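The classical criterion compares the joint empirical CDF with the product of the marginal empirical CDFs. A hedged sketch of that empirical statistic (the exact normalization and the paper's extension to non-identically distributed data are omitted):

```python
# Empirical Hoeffding–Blum–Kiefer–Rosenblatt-type statistic:
# B_n = sum_i [H_n(X_i, Y_i) - F_n(X_i) * G_n(Y_i)]^2,
# where H_n is the joint ECDF and F_n, G_n the marginal ECDFs.
def hbkr_statistic(pairs):
    n = len(pairs)
    total = 0.0
    for xi, yi in pairs:
        F = sum(1 for x, _ in pairs if x <= xi) / n               # marginal of X
        G = sum(1 for _, y in pairs if y <= yi) / n               # marginal of Y
        H = sum(1 for x, y in pairs if x <= xi and y <= yi) / n   # joint ECDF
        total += (H - F * G) ** 2
    return total

# Perfectly dependent data yield a clearly positive value on this toy example.
dependent = [(i, i) for i in range(1, 9)]
assert hbkr_statistic(dependent) > 0.1
```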
Interior transmission eigenvalue problems for the Helmholtz equation play an important role in inverse wave scattering. Some distribution properties of those eigenvalues in the complex plane are reviewed. Further, a new scattering model for the interior transmission eigenvalue problem with mixed boundary conditions is described and an efficient algorithm for computing the interior transmission eigenvalues is proposed. Finally, extensive numerical results for a variety of two-dimensional scatterers are presented to show the validity of the proposed scheme.
We present new numerical results for shape optimization problems of interior Neumann eigenvalues. This field is not well understood from a theoretical standpoint. The existence of shape maximizers is not proven beyond the first two eigenvalues, so we study the problem numerically. We describe a method to compute the eigenvalues for a given shape that combines the boundary element method with an algorithm for nonlinear eigenvalues. As numerical optimization requires many such evaluations, we put a focus on the efficiency of the method and the implemented routine. The method is well suited for parallelization. Using the resulting fast routines and a specialized parametrization of the shapes, we found improved maxima for several eigenvalues.
As part of the transnational research project EDITOR, a parabolic trough collector (PTC) system with concrete thermal energy storage (C-TES) was installed and commissioned in Limassol, Cyprus. The system is located on the premises of the beverage manufacturer KEAN Soft Drinks Ltd., and its function is to supply process steam for the factory's pasteurisation process [1]. Depending on the factory's seasonally varying beverage production capacity, the solar system delivers between 5 and 25% of the total steam demand. In combination with the C-TES, the solar plant can supply process steam on demand before sunrise or after sunset. Furthermore, the C-TES compensates for fluctuations in the PTC output during the day under changing weather conditions. The parabolic trough collector as well as the control and oil handling unit were designed and manufactured by Protarget AG, Germany. The C-TES was designed and produced by CADE Soluciones de Ingeniería, S.L., Spain. The focus of this paper is the description of the operational experience with the PTC, C-TES, and boiler during the commissioning and operation phases. Additionally, innovative optimisation measures are presented.
This publication presents the current state of research on the rebound effect. First, a systematic literature review is carried out to outline current scientific models and theories. Research Question 1 is addressed with a mathematical introduction of the rebound effect, which shows the interdependence of consumer behaviour, technological progress, and the effects interwoven with both. Thereupon, the research field is analysed for gaps and limitations by means of the systematic literature review. To ensure quantitative and qualitative results, a review protocol is used that comprises two stages and covers all relevant publications released between 2000 and 2019. Accordingly, 392 publications dealing with the rebound effect were identified. These papers were reviewed to obtain information relevant to the two research questions. The literature review shows that research on the rebound effect is not yet comprehensive and focuses mainly on the effect itself rather than on solutions to avoid it. For Research Question 2, the main gap, and thus limitation, is that little research has yet been published on actually avoiding the rebound effect. This is a major limitation for practical application by decision-makers and politicians. Therefore, a theoretical analysis was carried out to identify potential theories and ideas for avoiding the rebound effect. The most obvious candidate is the theory of a Steady-State Economy (SSE), which has been described and reviewed.
The management of scarce resources is an important aspect in the development of modern countries and of those on the threshold of becoming industrialised nations. The effects of mistaken resource management are not only of a purely economic nature but also of a social and socio-economic nature. To present one aspect of these dependencies and influences, this paper uses a quantitative analysis to examine the interdependence and impact of resource rents on socio-economic development from 2002 to 2017. Nigeria and Norway have been chosen as reference countries due to their abundance of natural resources combined with similar economic performance, while their rankings in the Human Development Index (HDI) differ dramatically. As the HDI provides insight into a country's cultural and socio-economic characteristics and development in addition to economic indicators, it allows a comparison of the two countries. The hypothesis presented and discussed in this paper was researched before: a qualitative research approach was used in the author's master's thesis "The Human Development Index (HDI) as a Reflection of Resource Abundance (using Nigeria and Norway as a case study)" in 2018. From a holistic perspective, this paper found that (unmanaged or poorly managed) resource wealth in itself has a negative impact on socio-economic development and significantly reduces the productivity of the citizens of a state.
For the years 2002 to 2017, this is expressed in particular in a negative correlation of GDP per capita and HDI value with the share, and respectively the size, of resource rents in a country's GDP.
The successful implementation and continuous development of sustainable corporate-level solutions is a challenge. These are endeavours in which social, environmental, and financial aspects must be weighed against each other. They can prove difficult to handle and, in some cases, almost unrealistic. Concepts such as green controlling, green IT, and green manufacturing look promising and are constantly evolving. This paper aims to achieve a better understanding of the field of corporate sustainability (CS). It evaluates the hypothesis that CS thrives by being efficient, increasing performance, and raising the value enterprises derive from the resources they use. On the surface, CS could seem to contradict the idea that it encourages reducing the heavy reliance on natural resources and the overall environmental impact and, above all, protecting those resources. To understand how this seemingly contradictory notion of CS came about, the first part of this paper places emphasis on providing useful insight in this regard and summarizes various definitions, organizational theories, and measures used for CS and its derivatives such as green controlling, IT, and manufacturing. Second, a case study is given that combines the aforementioned sustainability models. In addition to evaluating the hypothesis, the overarching objective of this paper is to demonstrate the use of green controlling, IT, and manufacturing in the corporate sector. Furthermore, this paper outlines the current challenges and possible future directions for CS.
The rapid development of virtual and data acquisition technologies makes Digital Twin (DT) technology one of the fundamental areas of research, and DT is one of the most promising developments for the achievement of Industry 4.0. 48% of organisations implementing the Internet of Things are already using DT or planned to use it in 2020. The global market for DT is expected to grow by 38 percent annually, reaching USD 16 billion by 2023. In addition, the number of organisations using digital twins is expected to triple by 2022. DTs are characterised by the integration of physical and virtual spaces. The driving idea of DT is to develop, test, and build devices in a virtual environment. The objective of this paper is to study the impact of DT in the automotive industry on the new marketing logic. The paper outlines the current challenges and possible future directions of DT in marketing and will be helpful for managers in the industry who wish to use the advantages and potential of DT.
As researchers continue to seek the expansion of the material base for additive manufacturing, there is a need to focus attention on the Ni–Cu group of alloys, which conventionally has wide industrial applications. In this work, the G-NiCu30Nb casting alloy, a variant of the Monel family of alloys with Nb and a high Si content, is for the first time processed via laser powder bed fusion (LPBF). The alloy being novel to LPBF processing, optimum LPBF parameters were determined, and hardness and tensile tests were performed in the as-built condition and after heat treatment at 1000 °C. Microstructures of the as-cast and the as-built conditions were compared. Highly dense samples (99.8% density) were achieved after varying the hatch distance (80 µm and 140 µm) and the scanning speed (550 mm/s–1500 mm/s). There was no significant difference in microhardness between the print sets with varied hatch distance. The microhardness of the as-built condition (247 HV0.2) exceeded that of the as-cast condition (179 HV0.2). Tensile specimens built in vertical (V) and horizontal (H) orientations revealed degrees of anisotropy and were superior to conventionally reported figures. Heat treatment increased ductility from 20% to 31% (V) and from 16% to 25% (H), while ultimate tensile strength (UTS) and yield strength (YS) were considerably reduced.
Background: Architectural representation, nurtured by the interaction between design thinking and design action, is inherently multi-layered. However, the representation object cannot always reflect these layers. Therefore, it is claimed that these reflections and layerings can gain visibility through 'performativity in personal knowledge', which fundamentally has a performative character. The specific layers of representation produced during performativity in personal knowledge permit insights into the 'personal way of designing' [1]. The question 'how can these layered drawings be decomposed to understand the personal way of designing?' can therefore be defined as the starting point of the study. Performativity in personal knowledge in architectural design is handled through the relationship between explicit and tacit knowledge and between representational and non-representational theory. To discuss the practical dimension of these theoretical relations, Zvi Hecker's drawing of the Heinz-Galinski-School is examined as an example. The study aims to understand the relationships between the layers by decomposing a layered drawing analytically in order to exemplify personal ways of designing.
Methods: The study is based on qualitative research methodologies. First, a model has been formed through theoretical readings to discuss the performativity in personal knowledge. This model is used to understand the layered representations and to research the personal way of designing. Thus, one drawing of Hecker’s Heinz-Galinski-School project is chosen. Second, its layers are decomposed to detect and analyze diverse objects, which hint to different types of design tools and their application. Third, Zvi Hecker’s statements of the design process are explained through the interview data [2] and other sources. The obtained data are compared with each other.
Results: By decomposing the drawing, eleven layers are defined. These layers are used to understand the relation between the design idea and its representation. They can also be thought of as a reading system. In other words, a method to discuss Hecker’s performativity in personal knowledge is developed. Furthermore, the layers and their interconnections are described in relation to Zvi Hecker’s personal way of designing.
Conclusions: It can be said that layered representations, which are associated with the multilayered structure of performativity in personal knowledge, form the personal way of designing.
The minimum dissipation requirement of the thermodynamics of irreversible processes is applied to characterize the existence of laminar and non-laminar flow, and the co-existence of laminar and turbulent flow zones. Local limitations of the different zones and three different forms of transition are defined. For the Couette flow, a non-local "corpuscular" flow mechanism explains the logarithmic law of the wall, maximum turbulent dimensions, and a value κ = 0.415 for the von Kármán constant. Limitations of the logarithmic law near the wall and in the centre of the experiment are interpreted.
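For reference, the logarithmic law of the wall mentioned above has the standard form u⁺ = (1/κ)·ln(y⁺) + B. A small illustrative evaluation with the κ = 0.415 value reported in the abstract follows; the intercept B = 5.0 is a conventional smooth-wall value assumed here, not a figure from the abstract:

```python
# Log-law-of-the-wall evaluation with the reported von Kármán constant.
from math import log

KAPPA = 0.415   # von Kármán constant from the abstract
B = 5.0         # typical smooth-wall intercept (assumed for illustration)

def u_plus(y_plus: float) -> float:
    """Dimensionless mean velocity in the logarithmic layer."""
    return log(y_plus) / KAPPA + B

# The profile rises by ln(10)/kappa per decade of wall distance y+.
slope_per_decade = u_plus(1000.0) - u_plus(100.0)
assert abs(slope_per_decade - log(10.0) / KAPPA) < 1e-9
```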
Past earthquakes demonstrated the high vulnerability of industrial facilities equipped with complex process technologies leading to serious damage of the process equipment and multiple and simultaneous release of hazardous substances in industrial facilities. Nevertheless, the design of industrial plants is inadequately described in recent codes and guidelines, as they do not consider the dynamic interaction between the structure and the installations and thus the effect of seismic response of the installations on the response of the structure and vice versa. The current code-based approach for the seismic design of industrial facilities is considered not enough for ensure proper safety conditions against exceptional event entailing loss of content and related consequences. Accordingly, SPIF project (Seismic Performance of Multi- Component Systems in Special Risk Industrial Facilities) was proposed within the framework of the European H2020 - SERA funding scheme (Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe). The objective of the SPIF project is the investigation of the seismic behavior of a representative industrial structure equipped with complex process technology by means of shaking table tests. The test structure is a three-story moment resisting steel frame with vertical and horizontal vessels and cabinets, arranged on the three levels and connected by pipes. The dynamic behavior of the test structure and installations is investigated with and without base isolation. Furthermore, both firmly anchored and isolated components are taken into account to compare their dynamic behavior and interactions with each other. Artificial and synthetic ground motions are applied to study the seismic response at different PGA levels. After each test, dynamic identification measurements are carried out to characterize the system condition. 
The contribution presents the numerical simulations used to calibrate the tests on the prototype, the experimental setup of the investigated structure and installations, and selected measurement data, and finally describes preliminary experimental results.
Motile cilia are hair-like cell extensions present in multiple organs of the body. How cilia coordinate their regular beat in multiciliated epithelia to move fluids remains insufficiently understood, particularly due to lack of rigorous quantification. We combine here experiments, novel analysis tools, and theory to address this knowledge gap. We investigate collective dynamics of cilia in the zebrafish nose, due to its conserved properties with other ciliated tissues and its superior accessibility for non-invasive imaging. We revealed that cilia are synchronized only locally and that the size of local synchronization domains increases with the viscosity of the surrounding medium. Despite the fact that synchronization is local only, we observed global patterns of traveling metachronal waves across the multiciliated epithelium. Intriguingly, these global wave direction patterns are conserved across individual fish, but different for left and right nose, unveiling a chiral asymmetry of metachronal coordination. To understand the implications of synchronization for fluid pumping, we used a computational model of a regular array of cilia. We found that local metachronal synchronization prevents steric collisions and improves fluid pumping in dense cilia carpets, but hardly affects the direction of fluid flow. In conclusion, we show that local synchronization together with tissue-scale cilia alignment are sufficient to generate metachronal wave patterns in multiciliated epithelia, which enhance their physiological function of fluid pumping.
Past earthquakes demonstrated the high vulnerability of industrial facilities equipped with complex process technologies, leading to serious damage of the process equipment and the multiple, simultaneous release of hazardous substances. Nevertheless, the design of industrial plants is inadequately described in recent codes and guidelines, as they do not consider the dynamic interaction between the structure and the installations, and thus the effect of the seismic response of the installations on the response of the structure and vice versa. The current code-based approach for the seismic design of industrial facilities is considered insufficient to ensure proper safety against exceptional events entailing loss of content and the related consequences. Accordingly, the SPIF project (Seismic Performance of Multi-Component Systems in Special Risk Industrial Facilities) was proposed within the framework of the European H2020 SERA funding scheme (Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe). The objective of the SPIF project is the investigation of the seismic behaviour of a representative industrial structure equipped with complex process technology by means of shaking table tests. The test structure is a three-story moment-resisting steel frame with vertical and horizontal vessels and cabinets, arranged on the three levels and connected by pipes. The dynamic behaviour of the test structure and of its several installations is investigated. Furthermore, the interactions between the process components and the primary structure are considered and analyzed. Several PGA-scaled artificial ground motions are applied to study the seismic response at different levels. After each test, dynamic identification measurements are carried out to characterize the system condition.
The contribution presents the experimental setup of the investigated structure and installations and selected measurement data, and describes the obtained damage. Furthermore, important findings on the definition of performance limits and on the effectiveness of floor response spectra in industrial facilities are presented and discussed.
The paper presents an overview of the past and present of low-emission combustor research with hydrogen-rich fuels at Aachen University of Applied Sciences. In 1990, AcUAS started developing the Dry-Low-NOx Micromix combustion technology. Micromix reduces NOx emissions using jet-in-crossflow mixing of multiple miniaturized fuel jets and combustor air, with inherent safety against flashback. Initially, pure hydrogen was investigated as fuel in lab-scale applications. Later, Micromix prototypes were developed for use in an industrial gas turbine, the Honeywell/Garrett GTCP-36-300, proving low-NOx characteristics during real gas turbine operation, accompanied by the successful definition of safety laws and control system modifications. Further, the Micromix was optimized for use in annular and can combustors as well as for fuel flexibility with hydrogen–methane mixtures and hydrogen-rich syngas qualities by means of extensive experiments and numerical simulations. In 2020, the latest Micromix application will be demonstrated in a commercial 2 MW-class gas turbine can combustor with full-scale engine operation. The paper discusses the advances in Micromix research over the last three decades.
Experimental investigation of behaviour of masonry infilled RC frames under out-of-plane loading
(2021)
Masonry infills are commonly used as exterior or interior walls in reinforced concrete (RC) frame structures, and they can be encountered all over the world, including in earthquake-prone regions. Since the middle of the 20th century, the behaviour of these non-structural elements under seismic loading has been studied in numerous experimental campaigns. However, most of the studies were carried out by means of in-plane tests, while there is a lack of out-of-plane experimental investigations. In this paper, the out-of-plane tests carried out on full-scale masonry-infilled frames are described. The results of the out-of-plane tests are presented in terms of force–displacement curves and measured out-of-plane displacements. Finally, the reliability of existing analytical approaches developed to estimate the out-of-plane strength of masonry infills is examined against the presented experimental results.
This study focuses on thermoelectric elements (TEE) as an alternative for room temperature control. TEE are semiconductor devices that can provide heating and cooling via a heat pump effect, without direct noise emissions and without refrigerants. An efficiency evaluation of the optimal operating mode is carried out for different numbers of TEE, ambient temperatures, and heating loads. The influence of an additional heat recovery unit on system efficiency and of an unevenly distributed heating demand is examined. The results show that TEE can provide heat at a coefficient of performance (COP) greater than one, especially for small heating demands and high ambient temperatures. The efficiency increases with the number of elements in the system and is subject to economies of scale. The best COP exceeds six at optimal operating conditions. An additional heat recovery unit proves beneficial for low ambient temperatures and systems with few TEE. It makes COPs above one possible at ambient temperatures below 0 °C. The effect increases efficiency by at most 0.81 (from 1.90 to 2.71) at an ambient temperature 5 K below room temperature and a heating demand Q̇ₕ = 100 W, but it is subject to diseconomies of scale. Thermoelectric technology is a valuable option for electricity-based heat supply and can provide cooling and ventilation functions. Careful system design as well as an additional heat recovery unit significantly benefits performance. This makes TEE superior to direct-current heating systems and competitive with heat pumps for small-scale applications with a focus on avoiding noise and harmful refrigerants.
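The COP figures above follow the usual definition COP = Q̇ₕ / P_el (delivered heat per unit of electrical input). A minimal sketch: the function and variable names are illustrative, and the numbers reproduce the heat-recovery example reported in the abstract:

```python
def cop(heat_output_w: float, electrical_input_w: float) -> float:
    """Coefficient of performance: delivered heat per electrical input."""
    return heat_output_w / electrical_input_w

# Illustrative reading of the abstract's heat-recovery effect at a
# heating demand of Q_h = 100 W: the COP improves from 1.90 to 2.71,
# i.e. by 0.81, so the required electrical input drops accordingly.
q_h = 100.0
p_without_recovery = q_h / 1.90  # electrical input without heat recovery
p_with_recovery = q_h / 2.71     # electrical input with heat recovery
print(f"COP gain: {cop(q_h, p_with_recovery) - cop(q_h, p_without_recovery):.2f}")
```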
Lignite biosolubilization and bioconversion by Bacillus sp.: the collation of analytical data
(2021)
The vast metabolic potential of microbes in brown coal (lignite) processing and utilization can greatly contribute to innovative approaches to the sustainable production of high-value products from coal. In this study, the multi-faceted and complex coal biosolubilization process by the Bacillus sp. RKB 7 isolate from Kazakhstan coal-mining soil is reported, and the derived products are characterized. Lignite solubilization tests performed for surface and suspension cultures testify to the formation of numerous soluble lignite-derived substances. Almost 24% of crude lignite (5% w/v) was solubilized within 14 days under slightly alkaline conditions (pH 8.2). FTIR analysis revealed various functional groups in the obtained biosolubilization products. Analyses of the lignite-derived humic products by UV-Vis and fluorescence spectrometry as well as elemental analysis yielded compatible results, indicating that the emerging products had a lower molecular weight and degree of aromaticity. Furthermore, XRD and SEM analyses were used to evaluate the biosolubilization processes from mineralogical and microscopic points of view. The findings not only provide a deeper understanding of microbe–mineral interactions in coal environments, but also contribute to knowledge of coal biosolubilization and bioconversion with regard to the sustainable production of humic substances. The detailed and comprehensive analyses demonstrate the huge biotechnological potential of Bacillus sp. for agricultural productivity and environmental health.
The low-pressure system Bernd brought extreme rainfall to the western part of Germany in July 2021, resulting in major floods, severe damage, and a tremendous number of casualties. Such extreme events are rare, and full flood protection can never be ensured with reasonable financial means. Still, this event must be a starting point for reconsidering current design concepts. This article aims at sharing some thoughts on potential hazards, the selection of return periods, and the remaining risk, with a focus on Germany.
For now, the Planetary Defense Conference Exercise 2021's incoming fictitious(!) asteroid, 2021 PDC, seems headed for impact on October 20th, 2021, exactly 6 months after its discovery. Today (April 26th, 2021), the impact probability is 5%, in a steep rise from 1 in 2500 upon discovery six days ago. We all know how these things end. Or do we? Unless somebody kicked off another headline-grabbing media scare or wants to keep civil defense very idle very soon, chances are that it will hit (note: this is an exercise!). Taking stock, it is barely 6 months to impact, a steadily rising likelihood that it will actually happen, and a huge uncertainty of possible impact energies: first estimates range from 1.2 MtTNT to 13 GtTNT, and this is not even the worst-worst case: a 700 m diameter massive NiFe asteroid (covered by a thin veneer of Ryugu-black rubble to match size and brightness) would come in at 70 GtTNT. In down-to-Earth terms, this could be anything between smashing fireworks over some remote area of the globe and a 7.5 km crater downtown somewhere. Considering the deliberate and sedate ways of development of interplanetary missions, it seems we can only stand and stare until we know well enough where to tell people to pack up all that can be moved at all and save themselves. But then, it could just as well be a smaller bright rock. The best estimate is 120 m diameter from optical observation alone, assuming 13% standard albedo. NASA's upcoming DART mission to binary asteroid (65803) Didymos is designed to hit such a small target, its moonlet Dimorphos. The Deep Impact mission's impactor in 2005 successfully guided itself to the brightest spot on comet 9P/Tempel 1, a relatively small feature on the 6 km nucleus. And 'space' has changed: by the end of this decade, one satellite communication network plans to have launched over 11000 satellites at a pace of 60 per launch every other week.
This level of series production is comparable in numbers to the most prolific commercial airliners. Launch vehicle production has not simply increased correspondingly – the vehicles can be reused, albeit in a trade for performance. Optical and radio astronomy as well as planetary radar have made great strides in the past decade, and so has the design and production capability for everyday 'high-tech' products. 60 years ago, spaceflight was invented from scratch within two years, and there are recent examples of fast-paced space projects as well as a drive towards 'responsive space'. It seems it is not quite yet time to abandon all hope. We present what could be done and what is too close to call once thinking is shoved out of the box by a clear and present danger, to show where a little more preparedness or routine would come in handy – or become decisive. And if we fail, let's stand and stare safely and well instrumented anywhere on Earth, together, in the greatest adventure of science.
This paper presents laser-based powder bed fusion (L-PBF) using various glass powders (borosilicate and quartz glass). Compared to metals, these require adapted process strategies. First, the glass powders were characterized with regard to their material properties and their processability in the powder bed. This was followed by investigations of the melting behavior of the glass powders with different laser wavelengths (10.6 µm, 1070 nm). In particular, the experimental setup of a CO2 laser was adapted for the processing of glass powder. An experimental setup with integrated coaxial temperature measurement/control and an inductively heatable build platform was created. This allowed the L-PBF process to be carried out at the transformation temperature of the glasses. Furthermore, the components’ material quality was analyzed on three-dimensional test specimens with regard to porosity, roughness, density, and geometrical accuracy in order to evaluate the developed L-PBF parameters and to open up possible applications.
Previous studies optimized the dimensions of coaxial heat exchangers using constant mass flow rates as a boundary condition. They show a thermally optimal circular ring width of nearly zero. Hydraulically optimal is an inner-to-outer pipe radius ratio of 0.65 for turbulent and 0.68 for laminar flow types. In contrast, in this study, flow conditions in the circular ring are kept constant (a set of fixed Reynolds numbers) during optimization. This approach ensures fixed flow conditions and prevents inappropriately high or low mass flow rates. The optimization is carried out for three objectives: maximum energy gain, minimum hydraulic effort, and, eventually, optimum net-exergy balance. The optimization changes the inner pipe radius and the mass flow rate but not the Reynolds number of the circular ring. The thermal calculations are based on Hellström’s borehole resistance, and the hydraulic optimization on individually calculated linear loss-of-head coefficients. Increasing the inner pipe radius results in decreased hydraulic losses in the inner pipe but increased losses in the circular ring. The net-exergy difference is a key performance indicator and combines the thermal and hydraulic calculations. It is the difference between the thermal exergy flux and the hydraulic effort. The Reynolds number in the circular ring, rather than the mass flow rate, is kept constant during all optimizations. The result from a thermal perspective is an optimal width of the circular ring of nearly zero. The hydraulically optimal inner pipe radius is 54% of the outer pipe radius for laminar flow and 60% for turbulent flow scenarios. Net-exergetic optimization shows a predominant influence of hydraulic losses, especially for small temperature gains. The exact result depends on the earth’s thermal properties and the flow type. Conclusively, the design of coaxial geothermal probes should focus on the hydraulic optimum and take the thermal optimum as a secondary criterion due to the dominating hydraulics.
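The net-exergy difference described above (thermal exergy flux minus hydraulic effort) can be sketched as follows; the Carnot weighting of the heat flow and all numerical values are illustrative assumptions, not values from the study:

```python
def thermal_exergy_flux(q_thermal_w: float, t_fluid_k: float,
                        t_ambient_k: float) -> float:
    """Exergy content of the extracted heat, weighted by a Carnot factor."""
    return q_thermal_w * (1.0 - t_ambient_k / t_fluid_k)

def net_exergy(q_thermal_w: float, t_fluid_k: float,
               t_ambient_k: float, hydraulic_power_w: float) -> float:
    """Net-exergy difference: thermal exergy flux minus hydraulic effort."""
    return thermal_exergy_flux(q_thermal_w, t_fluid_k, t_ambient_k) - hydraulic_power_w

# Illustrative case with a small temperature gain: the thermal exergy of
# 5 kW of heat at only 7 K above ambient (~123 W) is easily outweighed by
# a 150 W pumping effort, matching the finding that hydraulics dominate.
print(net_exergy(5000.0, 285.0, 278.0, 150.0))
```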
Digital Shadows as the aggregation, linkage and abstraction of data relating to physical objects are a central vision for the future of production. However, the majority of current research takes a technocentric approach, in which the human actors in production play a minor role. Here, the authors present an alternative anthropocentric perspective that highlights the potential and main challenges of extending the concept of Digital Shadows to humans. Following future research methodology, three prospections that illustrate use cases for Human Digital Shadows across organizational and hierarchical levels are developed: human-robot collaboration for manual work, decision support and work organization, as well as human resource management. Potentials and challenges are identified using separate SWOT analyses for the three prospections and common themes are emphasized in a concluding discussion.
An acetoin biosensor based on a capacitive electrolyte–insulator–semiconductor (EIS) structure modified with the enzyme acetoin reductase, also known as butane-2,3-diol dehydrogenase (Bacillus clausii DSM 8716ᵀ), is applied for acetoin detection in beer, red wine, and fermentation broth samples for the first time. The EIS sensor consists of an Al/p-Si/SiO₂/Ta₂O₅ layer structure with immobilized acetoin reductase on top of the Ta₂O₅ transducer layer by means of crosslinking via glutaraldehyde. The unmodified and enzyme-modified sensors are electrochemically characterized by means of leakage current, capacitance–voltage, and constant capacitance methods, respectively.
The paper presents the derivation of a new equivalent skin friction coefficient for estimating the parasitic drag of short-to-medium-range fixed-wing unmanned aircraft. The new coefficient is derived from an aerodynamic analysis of ten different unmanned aircraft used for surveillance, reconnaissance, and search-and-rescue missions. The aircraft are simulated using a validated unsteady Reynolds-averaged Navier–Stokes approach. The UAV’s parasitic drag is significantly influenced by the presence of miscellaneous components like fixed landing gears or electro-optical sensor turrets. These components are responsible for almost half of an unmanned aircraft’s total parasitic drag. The new equivalent skin friction coefficient accounts for these effects and is significantly higher compared to those of other aircraft categories. It is used to initially size an unmanned aircraft for a typical reconnaissance mission. The improved parasitic drag estimation yields a much heavier unmanned aircraft when compared to the sizing results using available drag data of manned aircraft.
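The equivalent-skin-friction approach referenced above estimates the zero-lift drag coefficient from the wetted area via C_D0 = C_fe · S_wet / S_ref. A minimal sketch; the C_fe value and the geometry below are placeholder assumptions for illustration, not the paper's derived coefficient:

```python
def parasite_drag_coefficient(c_fe: float, s_wet: float, s_ref: float) -> float:
    """Zero-lift (parasitic) drag coefficient from the equivalent
    skin friction method: C_D0 = C_fe * S_wet / S_ref."""
    return c_fe * s_wet / s_ref

def parasite_drag_force(c_d0: float, rho: float, v: float, s_ref: float) -> float:
    """Parasitic drag force D = q * S_ref * C_D0, with q = 0.5 * rho * V^2."""
    return 0.5 * rho * v**2 * s_ref * c_d0

# Illustrative numbers only: a small UAV with 4 m^2 wetted area,
# 1 m^2 reference wing area, and an assumed C_fe of 0.009.
cd0 = parasite_drag_coefficient(0.009, 4.0, 1.0)
print(f"C_D0 = {cd0:.4f}, drag at 30 m/s: "
      f"{parasite_drag_force(cd0, 1.225, 30.0, 1.0):.1f} N")
```

A higher C_fe, as derived in the paper for this aircraft category, directly scales C_D0 and hence the required installed power during initial sizing.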
Development of open educational resources for renewable energy and the energy transition process
(2021)
The dissemination of knowledge about renewable energies is understood as a social task of the highest topicality. The transfer of teaching content on renewable energies into digital open educational resources offers the opportunity to significantly accelerate the implementation of the energy transition. Thus, in the project presented here, six German universities create open educational resources for the energy transition. These materials are available to the public on the internet under a free license. So far, there have been no publicly accessible, editable media that cover entire learning units about renewable energies extensively and in high technical quality. Thus, in this project, content that remains up-to-date for a longer period is appropriately prepared in terms of media didactics. The materials enable lecturers to provide students with in-depth training on technologies for the energy transition. In a particular way, the created material is also suitable for educating the general public about the energy transition with scientifically based material.
Seismic vulnerability estimation of existing structures is unquestionably a topic of high priority, particularly after earthquake events. Bearing in mind the vast number of old masonry buildings in North Macedonia serving as public institutions, it is evident that the structural assessment of these buildings is an issue of great importance. In this paper, a comprehensive methodology for the development of seismic fragility curves of existing masonry buildings is presented. A scenario-based method that incorporates knowledge of the tectonic style of the considered region, the active fault characterization, the earth crust model, and the historical seismicity (determined via the Neo-Deterministic approach) is used for the calculation of the necessary response spectra. The capacity of the investigated masonry buildings has been determined by using nonlinear static analysis. MINEA software (SDA Engineering) is used for the verification of the structural safety of the structures. The performance point, obtained from the intersection of the capacity curve of the building and the spectra used, is selected as the response parameter. The thresholds of the spectral displacement are obtained by splitting the capacity curve into five parts, utilizing empirical formulas expressed as a function of the yield displacement and the ultimate displacement. As a result, four damage limit states are determined. A maximum likelihood estimation procedure for the determination of the fragility curves is the final step in the proposed procedure. As a result, a region-specific series of vulnerability curves for the structures is defined.
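The maximum-likelihood fragility fitting named as the final step above is commonly formulated with a lognormal model, P(DS ≥ ds | S_d) = Φ(ln(S_d/θ)/β), with median θ and dispersion β. A minimal sketch using a coarse grid search; the observation data, the grids, and the function names are illustrative assumptions, not the paper's implementation:

```python
import math
from statistics import NormalDist

PHI = NormalDist().cdf  # standard normal CDF

def fragility(sd: float, theta: float, beta: float) -> float:
    """Lognormal fragility: probability of reaching the damage state at demand sd."""
    return PHI(math.log(sd / theta) / beta)

def log_likelihood(data, theta, beta):
    """data: list of (spectral displacement, damage observed as 0/1)."""
    ll = 0.0
    for sd, damaged in data:
        p = min(max(fragility(sd, theta, beta), 1e-12), 1.0 - 1e-12)
        ll += math.log(p) if damaged else math.log(1.0 - p)
    return ll

def fit_mle(data, thetas, betas):
    """Grid-search maximum likelihood estimate of (theta, beta)."""
    return max(((t, b) for t in thetas for b in betas),
               key=lambda params: log_likelihood(data, *params))

# Illustrative observations: (spectral displacement in cm, damaged?)
obs = [(0.5, 0), (1.0, 0), (1.5, 1), (2.0, 1), (2.5, 1)]
theta_hat, beta_hat = fit_mle(obs,
                              [0.5 + 0.1 * i for i in range(30)],
                              [0.2 + 0.1 * i for i in range(10)])
print(f"theta = {theta_hat:.2f}, beta = {beta_hat:.2f}")
```

In practice, the grid search would be replaced by a numerical optimizer, and one curve is fitted per damage limit state.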
This paper describes the concept of an innovative, interdisciplinary, user-oriented earthquake warning and rapid response system coupled with a structural health monitoring (SHM) system, capable of detecting structural damage in real time. The novel system is based on interconnected, decentralized seismic and structural health monitoring sensors. It is being developed and will be exemplarily applied to critical infrastructures in the Lower Rhine Region, in particular to a road bridge and within a chemical industrial facility. A communication network is responsible for exchanging information between the sensors and forwarding warnings and status reports about the infrastructures’ health condition to the concerned recipients (e.g., facility operators, local authorities). Safety measures such as emergency shutdowns are activated to mitigate structural damage and damage propagation. Local monitoring systems of the infrastructures are integrated into BIM models. The visualization of sensor data and the graphic representation of the detected damage provide spatial context to the sensor data and serve as a useful and effective tool for decision-making processes after an earthquake in the region under consideration.
Reinforced concrete frames with masonry infill walls are a popular form of construction all over the world, including in seismic regions. While severe earthquakes can cause a high level of damage to both the reinforced concrete and the masonry infills, earthquakes of lower to medium intensity can sometimes cause a significant level of damage to the masonry infill walls. Especially important is the level of damage of face-loaded infill masonry walls (out-of-plane direction), as out-of-plane load can not only cause a high level of damage to the wall, it can also be life-threatening for people near the wall. The response in the out-of-plane direction directly depends on the prior in-plane damage, as previous investigations have shown that it decreases the resistance capacity of the infills. The behaviour of infill masonry walls with and without prior in-plane load is investigated in the experimental campaign, and the results are presented in this paper. These results are then compared with analytical approaches for the out-of-plane resistance from the literature. Conclusions based on the experimental campaign regarding the influence of prior in-plane damage on the out-of-plane response of infill walls are compared with the conclusions of other authors who investigated the same problem.