Institute
- Fachbereich Medizintechnik und Technomathematik (124)
- IfB - Institut für Bioengineering (60)
- Fachbereich Energietechnik (27)
- INB - Institut für Nano- und Biotechnologien (24)
- Fachbereich Maschinenbau und Mechatronik (17)
- Fachbereich Luft- und Raumfahrttechnik (16)
- Fachbereich Wirtschaftswissenschaften (15)
- Fachbereich Chemie und Biotechnologie (14)
- Fachbereich Bauingenieurwesen (7)
- Fachbereich Elektrotechnik und Informationstechnik (7)
Has Fulltext
- yes (226)
Language
- English (226)
Document Type
- Conference Proceeding (132)
- Article (83)
- Lecture (5)
- Working Paper (4)
- Conference Poster (1)
- Talk (1)
Keywords
- Biosensor (25)
- Finite-Elemente-Methode (12)
- Einspielen <Werkstoff> (10)
- CAD (8)
- civil engineering (8)
- Bauingenieurwesen (7)
- Blitzschutz (6)
- FEM (6)
- Clusterion (5)
- shakedown analysis (5)
We consider a binary multivariate regression model in which the conditional expectation of a binary variable given a higher-dimensional input variable belongs to a parametric family. Based on this, we introduce a model-based bootstrap (MBB) test for higher-dimensional input variables, which can be used to check whether a sequence of independent and identically distributed observations belongs to such a parametric family. The approach is based on the empirical residual process introduced by Stute (Ann Statist 25:613–641, 1997). In contrast to the approach of Stute & Zhu (Scandinavian J Statist 29:535–545, 2002), a transformation is not required. Thus, any problems associated with non-parametric regression estimation are avoided. As a result, the MBB method is much easier for users to implement. To illustrate the power of the MBB-based tests, a small simulation study is performed. Compared to the approach of Stute & Zhu, the simulations indicate a slightly improved power of the MBB-based method. Finally, both methods are applied to a real data set.
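The MBB idea above can be sketched for the special case of a logistic specification: fit the model, form a Cramér–von Mises-type statistic of the cumulative residual process, regenerate binary responses from the fitted probabilities, and compare. This is a minimal illustration of the general scheme, not the authors' exact construction; in particular, ordering residuals by the fitted linear index is a simplifying assumption made here.

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Fit a logistic regression (intercept included) by Newton/IRLS."""
    Z = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Z.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Z @ beta))
        W = p * (1 - p) + 1e-12
        H = Z.T @ (Z * W[:, None]) + 1e-8 * np.eye(Z.shape[1])
        beta += np.linalg.solve(H, Z.T @ (y - p))  # Newton step
    return beta

def cvm_statistic(X, y, beta):
    """Cramer-von-Mises-type statistic of the cumulative residual process,
    cumulating residuals along the fitted linear index (an illustrative choice)."""
    Z = np.column_stack([np.ones(len(X)), X])
    p = 1.0 / (1.0 + np.exp(-Z @ beta))
    order = np.argsort(Z @ beta)
    R = np.cumsum((y - p)[order]) / np.sqrt(len(y))
    return float(np.mean(R ** 2))

def mbb_test(X, y, B=200, seed=0):
    """Model-based bootstrap p-value for H0: the logistic model is correct."""
    rng = np.random.default_rng(seed)
    beta = fit_logistic(X, y)
    t_obs = cvm_statistic(X, y, beta)
    Z = np.column_stack([np.ones(len(X)), X])
    p_hat = 1.0 / (1.0 + np.exp(-Z @ beta))
    t_boot = []
    for _ in range(B):
        y_star = rng.binomial(1, p_hat)        # resample Y from the fitted model
        beta_star = fit_logistic(X, y_star)    # refit under the bootstrap sample
        t_boot.append(cvm_statistic(X, y_star, beta_star))
    return float(np.mean(np.array(t_boot) >= t_obs))
```

Because the bootstrap responses are drawn from the fitted parametric model, no non-parametric smoothing step is needed anywhere in the procedure.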
Extension fractures are typical of deformation under low or no confining pressure. They can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock has been developed. In this article, it is shown that the simple extension strain criterion makes unrealistic strength predictions in biaxial compression and tension. To overcome this major limitation, a new extension strain criterion is proposed by adding a weighted principal shear component to the simple criterion. The shear weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting extension failure modes, which are unexpected in the classical understanding of the failure of cohesive-frictional materials. In the progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain, leading to dilatancy. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD) and as a failure surface at peak stress (CP). Unlike compressive loading, tensile loading requires only a limited number of critical cracks to cause failure. Therefore, for tensile stresses, the failure criterion must be modified, for example by a cut-off corresponding to the CI stress. Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass than the simple extension strain criterion.
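One plausible reading of the construction above, in schematic form (notation ours, compression taken positive; the paper's exact enrichment term and calibration may differ):

```latex
% simple extension strain criterion: failure when the least principal
% (Hookean) strain reaches a critical extension value
\varepsilon_3 \;=\; \frac{1}{E}\left[\sigma_3 - \nu\,(\sigma_1 + \sigma_2)\right]
\;\le\; -\varepsilon_{\mathrm{crit}}

% enriched criterion: a weighted principal shear strain is added, with the
% weight w calibrated so the surface coincides with Mohr--Coulomb
\varepsilon_3 \;-\; w\,\frac{\varepsilon_1 - \varepsilon_3}{2}
\;\le\; -\varepsilon_{\mathrm{crit}}
```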
Previous studies optimized the dimensions of coaxial heat exchangers using constant mass flow rates as a boundary condition. They show a thermally optimal circular-ring width of nearly zero. Hydraulically optimal is an inner-to-outer pipe radius ratio of 0.65 for turbulent and 0.68 for laminar flow types. In contrast, in this study, flow conditions in the circular ring are kept constant (a set of fixed Reynolds numbers) during optimization. This approach ensures fixed flow conditions and prevents inappropriately high or low mass flow rates. The optimization is carried out for three objectives: maximum energy gain, minimum hydraulic effort, and optimum net-exergy balance. The optimization changes the inner pipe radius and the mass flow rate but not the Reynolds number of the circular ring. The thermal calculations are based on Hellström’s borehole resistance, the hydraulic optimization on individually calculated linear head-loss coefficients. Increasing the inner pipe radius results in decreased hydraulic losses in the inner pipe but increased losses in the circular ring. The net-exergy difference is a key performance indicator combining the thermal and hydraulic calculations: it is the difference between the thermal exergy flux and the hydraulic effort. The Reynolds number in the circular ring, rather than the mass flow rate, is kept constant during all optimizations. From a thermal perspective, the result is an optimal circular-ring width of nearly zero. The hydraulically optimal inner pipe radius is 54% of the outer pipe radius for laminar and 60% for turbulent flow scenarios. Net-exergetic optimization shows a predominant influence of the hydraulic losses, especially for small temperature gains. The exact result depends on the earth’s thermal properties and the flow type. In conclusion, the design of coaxial geothermal probes should focus on the hydraulic optimum and treat the thermal optimum as a secondary criterion, due to the dominating hydraulics.
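The net-exergy balance used as the key performance indicator can be written schematically as follows (notation ours; the paper's exact definitions, e.g. of the reference temperature, may differ):

```latex
% thermal exergy flux of the extracted heat (Carnot factor, reference T_0)
\dot{E}_{\mathrm{th}} = \dot{Q}\left(1 - \frac{T_0}{T}\right)

% hydraulic effort = pumping power from the pressure loss
P_{\mathrm{hyd}} = \Delta p \,\dot{V}

% net-exergy difference: thermal exergy flux minus hydraulic effort
\Delta\dot{E}_{\mathrm{net}} = \dot{E}_{\mathrm{th}} - P_{\mathrm{hyd}}
```

For small temperature gains the Carnot factor is small, which is why the hydraulic term dominates the optimum.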
The paper presents the derivation of a new equivalent skin friction coefficient for estimating the parasitic drag of short-to-medium-range fixed-wing unmanned aircraft. The new coefficient is derived from an aerodynamic analysis of ten different unmanned aircraft used for surveillance, reconnaissance, and search-and-rescue missions. The aircraft are simulated using a validated unsteady Reynolds-averaged Navier–Stokes approach. The UAVs’ parasitic drag is significantly influenced by the presence of miscellaneous components such as fixed landing gears or electro-optical sensor turrets. These components are responsible for almost half of an unmanned aircraft’s total parasitic drag. The new equivalent skin friction coefficient accounts for these effects and is significantly higher than that of other aircraft categories. It is used to initially size an unmanned aircraft for a typical reconnaissance mission. The improved parasitic drag estimation yields a much heavier unmanned aircraft compared to sizing results based on available drag data of manned aircraft.
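The equivalent-skin-friction method itself is a standard estimate, D0 = q·C_fe·S_wet; the sketch below shows how such a coefficient enters a sizing-level drag estimate. The numerical values used in the test are illustrative placeholders, not the coefficient derived in the paper.

```python
def parasitic_drag(c_fe, s_wet, rho, v, s_ref=None):
    """Estimate parasitic (zero-lift) drag from an equivalent skin friction
    coefficient C_fe: D0 = q * C_fe * S_wet, with q = 0.5 * rho * V^2."""
    q = 0.5 * rho * v ** 2                    # dynamic pressure [Pa]
    result = {"q": q, "D0": q * c_fe * s_wet}  # drag force [N]
    if s_ref is not None:
        # referred to the reference (wing) area: C_D0 = C_fe * S_wet / S_ref
        result["CD0"] = c_fe * s_wet / s_ref
    return result
```

A higher C_fe for UAVs with fixed gears and sensor turrets directly raises D0 and C_D0, which is what drives the heavier sizing result.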
Reliable automation of the labor-intensive manual task of scoring animal sleep can facilitate the analysis of long-term sleep studies. In recent years, deep-learning-based systems, which learn optimal features from the data, have increased scoring accuracies for the classical sleep stages of Wake, REM, and Non-REM. Meanwhile, it has been recognized that the statistics of transitional stages such as pre-REM, found between Non-REM and REM, may hold additional insight into the physiology of sleep, and these stages are now under active investigation. We propose a classification system based on a simple neural network architecture that scores the classical stages as well as pre-REM sleep in mice. When restricted to the classical stages, the optimized network showed state-of-the-art classification performance with an out-of-sample F1 score of 0.95 in male C57BL/6J mice. When unrestricted, the network showed a lower F1 score on pre-REM (0.5) compared to the classical stages. This result is comparable to previous attempts to score transitional stages in other species, such as transition sleep in rats or N1 sleep in humans. Nevertheless, we observed that sequences of predictions including pre-REM typically transitioned from Non-REM to REM, reflecting the sleep dynamics observed by human scorers. Our findings provide further evidence for the difficulty of scoring transitional sleep stages, likely because such stages are under-represented in typical data sets or show large inter-scorer variability. We further provide our source code and an online platform to run predictions with our trained network.
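The per-stage F1 scores quoted above (0.95 for the classical stages, 0.5 for pre-REM) are the harmonic mean of precision and recall per class; a minimal sketch, with illustrative stage labels:

```python
def f1_score(y_true, y_pred, positive):
    """Per-class F1 = 2 * precision * recall / (precision + recall)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
```

A rare class like pre-REM drags down recall (missed epochs count as false negatives), which is one reason its F1 stays low even when overall accuracy is high.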
Quantitative nuclear magnetic resonance (qNMR) is considered a powerful tool for multicomponent mixture analysis as well as for the purity determination of single compounds. Special attention is currently paid to the training of operators and study directors involved in qNMR testing. To ensure that only qualified personnel carry out sample preparation at our GxP-accredited laboratory, a weighing test was proposed. Sixteen participants performed six-fold weighing of a binary mixture of butylated hydroxytoluene (BHT) and 1,2,4,5-tetrachloro-3-nitrobenzene (TCNB). To evaluate the quality of the data analysis, all spectra were evaluated both manually by a qNMR expert and using an in-house-developed automated routine. The results revealed that the mean values are comparable and that both evaluation approaches are free of systematic error. However, the automated evaluation resulted in an approximately 20% increase in precision. The same findings were obtained for the qNMR analysis of 32 compounds used in the pharmaceutical industry. A weighing test based on six-fold determination in binary mixtures, together with the automated qNMR methodology, can be recommended as an efficient tool for evaluating staff proficiency. The automated qNMR method significantly increases the throughput and precision of routine qNMR measurements and extends the application scope of qNMR.
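Purity determination by qNMR against an internal standard follows the standard relation (notation ours):

```latex
P_x \;=\; \frac{I_x}{I_{\mathrm{std}}}\cdot\frac{N_{\mathrm{std}}}{N_x}
      \cdot\frac{M_x}{M_{\mathrm{std}}}\cdot\frac{m_{\mathrm{std}}}{m_x}
      \cdot P_{\mathrm{std}}
```

Here I are the signal integrals, N the numbers of nuclei contributing to each signal, M the molar masses, m the weighed masses, and P the purities. The weighed masses enter directly as the ratio m_std/m_x, so weighing errors propagate linearly into the purity result, which is what motivates a weighing test for staff proficiency.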
Sterilization is the treatment method used to inactivate viable microorganisms on objects or products. There are multiple forms of sterilization, each intended for a specific target, depending on (but not limited to) the thermal, physical, and chemical stability of that target. Herein, an overview of the sterilization processes currently used in the global market is provided. The different sterilization techniques are grouped into categories describing the method of treatment: radiation (gamma, electron beam, X-ray, and ultraviolet), thermal (dry and moist heat), and chemical (ethylene oxide, ozone, chlorine dioxide, and hydrogen peroxide). For each sterilization process, the typical process parameters as defined by regulations and the mode of antimicrobial activity are summarized. Finally, the microorganisms recommended as biological indicators for validating sterilization processes, in accordance with the rules established by the various regulatory agencies, are listed.
An acetoin biosensor based on a capacitive electrolyte–insulator–semiconductor (EIS) structure modified with the enzyme acetoin reductase, also known as butane-2,3-diol dehydrogenase (Bacillus clausii DSM 8716ᵀ), is applied for the first time for acetoin detection in beer, red wine, and fermentation broth samples. The EIS sensor consists of an Al/p-Si/SiO₂/Ta₂O₅ layer structure with acetoin reductase immobilized on top of the Ta₂O₅ transducer layer by crosslinking via glutaraldehyde. The unmodified and enzyme-modified sensors are electrochemically characterized by means of leakage-current, capacitance–voltage, and constant-capacitance measurements.
This paper compares several propeller simulation tools based on the blade element theory (BET) method, including an evaluation against static propeller ground tests and high-fidelity Reynolds-averaged Navier–Stokes (RANS) simulations. Two proprietary propeller geometries for paraglider applications are analysed in static and flight conditions. The RANS simulations are validated with the static test data and used as a reference for comparing the BET in flight conditions. The comparison includes the analysis of varying 2D aerodynamic airfoil parameters and different induced-velocity calculation methods. The evaluation shows the strength of the BET tools compared to RANS simulations: the RANS simulations underpredict the static experimental data within 10% relative error, while appropriate BET tools overpredict the RANS results by 15–20% relative error. A variation of the 2D aerodynamic data demonstrates the need for highly accurate 2D data to obtain accurate BET results. The nonlinear BET coupled with XFOIL for the 2D aerodynamic data matches the RANS results best in both static operation and flight conditions. The novel BET tool PropCODE combines both approaches and offers further correction models for highly accurate static and flight-condition results.
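At the core of every BET tool are the elemental blade forces; in a common formulation (notation ours — the details, in particular the induced-velocity model, differ between the compared tools):

```latex
% thrust and torque contributions of a blade element at radius r,
% with B blades, chord c(r), resultant velocity W and inflow angle \varphi
% (both of which include the induced velocity)
\mathrm{d}T = \tfrac{1}{2}\,\rho\,W^2\,c(r)\,B
  \left(C_l\cos\varphi - C_d\sin\varphi\right)\mathrm{d}r

\mathrm{d}Q = \tfrac{1}{2}\,\rho\,W^2\,c(r)\,B
  \left(C_l\sin\varphi + C_d\cos\varphi\right) r\,\mathrm{d}r
```

The lift and drag coefficients C_l and C_d come from 2D airfoil data (e.g. from XFOIL), which is why the accuracy of that 2D data dominates the accuracy of the integrated thrust and torque.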
In traditional microbial biobutanol production, the solvent must be recovered during the fermentation process to achieve a sufficient space-time yield. Thermal separation is not feasible due to the high boiling point of n-butanol. As an integrated and selective solid–liquid separation alternative, solvent-impregnated resins (SIRs) were applied. Two polymeric resins were evaluated and an extractant screening was conducted. Vacuum application with vapor collection in a fixed-bed column, operated as a bioreactor bypass, was successfully implemented as the butanol desorption step. To further improve the process economics, fermentations with renewable lignocellulosic substrates were conducted using Clostridium acetobutylicum. The utilization of SIRs was shown to be a potential strategy for solvent removal from fermentation broth, while the application of a bypass column allows for product removal and recovery in a single step.
In this chapter, the key technologies and the instrumentation required for the subsurface exploration of ocean worlds are discussed. The focus is on Jupiter’s moon Europa and Saturn’s moon Enceladus, because they have the highest potential for such missions in the near future. The exploration of their oceans requires landing on the surface, penetrating the thick ice shell with an ice-penetrating probe, and probably diving with an underwater vehicle through dozens of kilometers of water to the ocean floor, in order to have a chance of finding life, if it exists. Technologically, such missions are extremely challenging. The required key technologies include power generation, communications, pressure resistance, radiation hardness, corrosion protection, navigation, miniaturization, autonomy, and sterilization and cleaning. Simpler mission concepts involve impactors and penetrators or – in the case of Enceladus – plume fly-through missions.
For short take-off and landing (STOL) aircraft, a parallel hybrid-electric propulsion system potentially offers superior performance compared to a conventional propulsion system, because the short-take-off power requirement is much higher than the cruise power requirement. This power-matching problem can be solved with a balanced hybrid propulsion system. However, there is a trade-off between wing loading, power loading, and the level of hybridization, as well as range and take-off distance. An optimization method can vary the design variables such that a minimum of a particular objective is attained. In this paper, the optimization results for minimum mass, minimum consumed primary energy, and minimum cost are compared. A new initial sizing algorithm for general aviation aircraft with hybrid-electric propulsion systems is applied. This initial sizing methodology covers point performance, mission performance analysis, the weight estimation process, and cost estimation. The methodology is applied to the design of a STOL general aviation aircraft intended for on-demand air mobility operations. The aircraft is sized to carry eight passengers over a distance of 500 km while being able to take off from and land on short airstrips. The results indicate that parallel hybrid-electric propulsion systems must be considered for future STOL aircraft.
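The power-matching idea can be illustrated with a simple supplied-power split: the electric motor covers part of the take-off peak, while the combustion engine is sized so that it can still sustain cruise alone. Both the hybridization factor H_P and the feasibility rule below are illustrative simplifications, not the paper's sizing algorithm.

```python
def split_powers(p_to, h_p):
    """Split the take-off power requirement P_TO of a parallel hybrid:
    P_EM = H_P * P_TO (electric motor), P_ICE = (1 - H_P) * P_TO (engine)."""
    return (1.0 - h_p) * p_to, h_p * p_to

def cruise_feasible(p_to, p_cruise, h_p):
    """Illustrative sizing check: the combustion engine alone should still
    cover the cruise power requirement, so the battery is only drained
    during the short take-off peak."""
    p_ice, _ = split_powers(p_to, h_p)
    return p_ice >= p_cruise
```

With a take-off requirement far above cruise, a substantial H_P remains feasible, which is the core of the STOL argument for parallel hybrids.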
A light-addressable potentiometric sensor (LAPS) is a field-effect-based (bio-)chemical sensor in which a desired sensing area on the sensor surface can be defined by illumination. Light addressability can be used to visualize the concentration and spatial distribution of target molecules, e.g., H+ ions. This unique feature has great potential for the label-free imaging of the metabolic activity of living organisms. The cultivation of those organisms requires specially tailored surface properties of the sensor. O2 plasma treatment is an attractive and promising tool for rapid surface engineering. However, the potential impacts of the technique must be carefully investigated, since such sensors can suffer from plasma-induced damage. Herein, a LAPS with a Ta2O5 pH-sensitive surface is successfully patterned by plasma treatment, and the effects are investigated by contact-angle and scanning LAPS measurements. A plasma duration of 30 s (at 30 W) is found to be the threshold value at which excessive wettability begins. Furthermore, this treatment causes moderate plasma-induced damage, which can be reduced by thermal annealing (10 min at 300 °C). These findings provide a useful guideline to support future studies in which the LAPS surface is to be made more hydrophilic by O2 plasma treatment.
Recent analyses of scientific data from Cassini and Earth-based observations gave evidence for a global ocean under a surrounding solid ice shell on Saturn's moon Enceladus. Images of Enceladus' south pole showed several fissures in the ice shell, with plumes constantly exhausting frozen water particles that build up the E-ring, one of the outer rings of Saturn. In this southern region of Enceladus, the ice shell is considered to be as thin as 2 km, about an order of magnitude thinner than on the rest of the moon. Under the ice shell, there is a global ocean of liquid water. Scientists are discussing different approaches for taking samples of this water, e.g. by melting through the ice with a melting probe. FH Aachen UAS has developed a prototype of a maneuverable melting probe which can navigate through the ice and which has already been tested successfully in a terrestrial environment. The conditions on Enceladus, however, are different: no atmosphere and thus no ambient pressure, low ice temperatures of around 100 to 150 K (near the south pole), and a very low gravity of 0.114 m/s², about 1.2% of Earth's gravity. Two of these influencing factors are to be investigated at FH Aachen UAS in 2017: low ice temperature and low ambient pressure below the triple point of water. Low gravity, however, cannot easily be simulated inside a large experiment chamber. Numerical simulations of the melting process at RWTH Aachen nevertheless show a gravity dependence of the melting behavior. Considering this aspect, VIPER provides a link between the large-scale experimental simulations at FH Aachen UAS and the numerical simulations at RWTH Aachen. To analyze the melting process, about 90 seconds of experiment time in reduced gravity and at low ambient pressure is provided by the REXUS rocket. In this time frame, the melting speed and the contact force between ice and probe are measured, as well as the heating power and a two-dimensional array of ice temperatures. Additionally, visual and infrared cameras are used to observe the melting process.
Electromechanical model of hiPSC-derived ventricular cardiomyocytes cocultured with fibroblasts
(2018)
The CellDrum provides an experimental setup to study the mechanical effects of fibroblasts co-cultured with hiPSC-derived ventricular cardiomyocytes. Multi-scale computational models based on the Finite Element Method are developed. Coupled electrical cardiomyocyte–fibroblast models (cell level) are embedded into reaction–diffusion equations (tissue level), which compute the propagation of the action potential in the cardiac tissue. Electromechanical coupling is realised by an excitation–contraction model (cell level), and the active stress arising during contraction is added to the passive stress in the force balance, which determines the tissue displacement (tissue level). The tissue parameters of the model can be identified experimentally for each specific sample.
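The model structure described above is typically of monodomain reaction–diffusion type with an additive active stress; schematically (symbols ours, not necessarily the authors' exact formulation):

```latex
% tissue level: action-potential propagation (monodomain form)
\frac{\partial V_m}{\partial t}
  = \nabla\cdot\left(\mathbf{D}\,\nabla V_m\right)
  - \frac{I_{\mathrm{ion}}(V_m,\mathbf{u})}{C_m}

% cell level: state variables of the coupled myocyte--fibroblast model
\frac{\mathrm{d}\mathbf{u}}{\mathrm{d}t} = \mathbf{f}(V_m,\mathbf{u})

% mechanics: active stress added to the passive stress in the balance of momentum
\boldsymbol{\sigma} = \boldsymbol{\sigma}_{\mathrm{pass}} + \boldsymbol{\sigma}_{\mathrm{act}},
\qquad \nabla\cdot\boldsymbol{\sigma} = \mathbf{0}
```

The electrical subproblem supplies the activation that drives the active stress, closing the excitation–contraction loop between cell and tissue level.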
Research collaborations provide opportunities for both practitioners and researchers: practitioners need solutions for difficult business challenges, and researchers are looking for hard problems to solve and publish. Nevertheless, research collaborations carry the risk that practitioners focus too much on quick solutions and that researchers tackle only theoretical problems, resulting in products which do not fulfill the project requirements.
In this paper we introduce an approach extending the ideas of agile and lean software development. It helps practitioners and researchers keep track of their common research collaboration goal: a scientifically enriched software product which fulfills the needs of the practitioner’s business model.
This approach gives first-class status to application-oriented metrics that measure progress and success of a research collaboration continuously. Those metrics are derived from the collaboration requirements and help to focus on a commonly defined goal.
An appropriate tool set evaluates and visualizes those metrics with minimal effort, encouraging all participants to focus on their tasks with appropriate effort. Thus, the project status, challenges, and progress are transparent to all research collaboration members at any time.
The Passivhaus building standard is a concept developed for the realization of energy-efficient and economical buildings with simultaneously high occupant comfort under European climate conditions. Major elements of the Passivhaus concept are a high thermal insulation of the external walls, the use of heat-protection and/or solar-shading glazing, and an airtight building envelope, in combination with energy-efficient technical building installations and heating or cooling generators, such as efficient energy recovery in the building air conditioning. The objective of this research project is to determine the parameters and constraints under which the Passivhaus concept can be implemented under the arid climate conditions of the Arabian Peninsula to achieve an energy-efficient and economical building with high occupant comfort. In a cooperation between the Qatar Green Building Council (QGBC), Barwa Real Estate (BRE), and Kahramaa, the first Passivhaus in Qatar and on the Arabian Peninsula was constructed in 2013. The Solar-Institut Jülich of Aachen University of Applied Sciences supports the Qatar Green Building Council with a dynamic building and equipment simulation of the Passivhaus and the neighbouring reference building. This includes simulation studies with different component configurations for the building envelope and different control strategies for the heating or cooling systems, as well as the air conditioning of the buildings, to find an energetic and economic optimum. Part of these analyses is the evaluation of the energy efficiency of the energy recovery system used in the Passivhaus air conditioning and the identification of possible energy-saving effects from the use of a bypass function integrated in the heat exchanger. On this basis, it is expected that, on an annual basis, the complete electricity demand of the building can be covered by the roof-integrated PV generator.
To give the exchange of goods and services between the European Union (EU) and the United States (U.S.) new momentum, the two parties are currently negotiating a transatlantic free trade agreement, the Transatlantic Trade and Investment Partnership (TTIP). The aim is to create the largest free trade area in the world. The agreement, once entered into force, will oblige the EU countries and the U.S. to further liberalize their markets.
The negotiations on TTIP include a chapter on Electronic Communications/Telecommunications. The challenge therein will be securing commitments for market access to Electronic Communications services. At the same time, these commitments must reflect the legitimate need for consumer protection. The need to reduce Electronic-Communications-related non-tariff barriers to trade between the parties stems from the fact that these markets are heavily regulated. Without transnational rules on regulation, national governments can abuse regulations to deter market entry by new (foreign) suppliers. Thus, the free trade agreement TTIP affects, in many respects, regulatory provisions on, and access to, Electronic Communications markets. The objective of this paper is therefore to examine to what extent the regulatory principles for Electronic Communications markets envisaged under TTIP will result in trade facilitation and regulatory convergence between the EU and the U.S.
The analysis concludes that the chapter on Electronic Communications will be an important step towards facilitating trade in Electronic Communications services. At the same time, some regulatory convergence will take place, but this convergence will not lead to a (full) harmonization of regulations. Rather, even after the TTIP negotiations have been concluded successfully, the norm will be mutual recognition of different regulatory regimes. Different regulations, being the optimal policy response in different market settings, will continue to exist. Moreover, it is very unlikely that such regulatory principles for the Electronic Communications sector will become a vehicle for a race to the bottom in levels of consumer protection.
Smoothed Finite Element Methods for Nonlinear Solid Mechanics Problems: 2D and 3D Case Studies
(2016)
The Smoothed Finite Element Method (SFEM) is presented as an edge-based technique for 2D and a face-based technique for 3D boundary value problems. SFEMs avoid shortcomings of the standard Finite Element Method (FEM) with lower-order elements, such as overly stiff behavior, poor stress solutions, and locking effects. Based on the idea of spatially averaging the standard FEM strain field over so-called smoothing domains, the SFEM calculates the stiffness matrix for the same number of degrees of freedom (DOFs) as the FEM. However, the SFEMs significantly improve accuracy and convergence, even for distorted meshes and/or nearly incompressible materials.
Numerical results of the SFEMs for a cardiac tissue membrane (thin-plate inflation) and an artery (tension of a 3D tube) clearly show their advantageous properties, improving accuracy particularly for distorted meshes and avoiding shear-locking effects.
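The central ingredient of the SFEM is the smoothed strain over each smoothing domain Ω_k (with area or volume A_k), which, by the divergence theorem, requires only boundary values of the displacement (standard SFEM notation; the exact smoothing-domain construction is edge- or face-based as described above):

```latex
\tilde{\boldsymbol{\varepsilon}}_k
  = \frac{1}{A_k}\int_{\Omega_k}\boldsymbol{\varepsilon}(\mathbf{x})\,\mathrm{d}\Omega
  = \frac{1}{2A_k}\oint_{\Gamma_k}
    \left(\mathbf{n}\otimes\mathbf{u} + \mathbf{u}\otimes\mathbf{n}\right)\mathrm{d}\Gamma
```

The stiffness matrix is then assembled from these piecewise-constant smoothed strains, which softens the overly stiff behavior of standard lower-order FEM without adding degrees of freedom.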