Refine
Year of publication
Institute
- Fachbereich Medizintechnik und Technomathematik (1562)
- Fachbereich Elektrotechnik und Informationstechnik (710)
- Fachbereich Energietechnik (560)
- IfB - Institut für Bioengineering (560)
- Fachbereich Chemie und Biotechnologie (537)
- INB - Institut für Nano- und Biotechnologien (533)
- Fachbereich Luft- und Raumfahrttechnik (479)
- Fachbereich Maschinenbau und Mechatronik (261)
- Fachbereich Wirtschaftswissenschaften (205)
- Solar-Institut Jülich (160)
Has Fulltext
- no (4678)
Language
- English (4678)
Document Type
- Article (3189)
- Conference Proceeding (1032)
- Part of a Book (190)
- Book (144)
- Doctoral Thesis (30)
- Conference: Meeting Abstract (27)
- Patent (25)
- Other (10)
- Report (9)
- Conference Poster (5)
Keywords
- Gamification (6)
- avalanche (6)
- Earthquake (5)
- Enterprise Architecture (5)
- MINLP (5)
- solar sail (5)
- Diversity Management (4)
- Energy storage (4)
- Engineering optimization (4)
- LAPS (4)
68Ga-radiopharmaceuticals are common in the field of nuclear medicine to visualize receptor-mediated processes. In contrast to straightforward labeling procedures for clinical applications, preclinical in vitro and in vivo applications are hampered by constraints such as volume restriction, activity concentration, molar activity and osmolality. Therefore, we developed a semiautomatic system specifically to overcome these problems. An unexpected difficulty appeared: intrinsic trace metals derived from the eluate (Zn, Fe and Cu) are concentrated as well, in amounts that influence the radiochemical yield and thus lower the molar activity.
Semi-insulating GaAs layers grown by molecular-beam epitaxy / P. Kordos ; A. Förster ; J. Betko ...
(1995)
Based on an identifying Volterra-type integral equation for randomly right-censored observations from a lifetime distribution function F, we solve the corresponding estimating equation by an explicit and an implicit Euler scheme. While the first approach results in some known estimators, the second one produces new semi-parametric and pre-smoothed Kaplan–Meier estimators which are real distribution functions rather than sub-distribution functions, as the former ones are. This property of the new estimators is particularly useful if one wants to estimate the expected lifetime restricted to the support of the observation time.
Specifically, we focus on estimation under the semi-parametric random censorship model (SRCM), that is, a random censorship model where the conditional expectation of the censoring indicator given the observation belongs to a parametric family. We show that some estimated linear functionals which are based on the new semi-parametric estimator are strongly consistent, asymptotically normal, and efficient under SRCM. In a small simulation study, the performance of the new estimator is illustrated under moderate sample sizes. Finally, we apply the new estimator to a well-known real dataset.
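The classical Kaplan–Meier (product-limit) estimator that the explicit Euler scheme recovers can be sketched in a few lines. This is a minimal illustration assuming distinct event times; the function name and interface are assumptions for the sketch, not the paper's implementation:

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit survival estimator for right-censored data.

    times:  observed durations (assumed distinct here)
    events: 1 for an observed failure, 0 for a censored observation
    Returns a list of (time, estimated survival probability) pairs.
    """
    order = np.argsort(times)
    times = np.asarray(times, dtype=float)[order]
    events = np.asarray(events, dtype=int)[order]
    n = len(times)
    surv = 1.0
    out = []
    for i, (t, d) in enumerate(zip(times, events)):
        at_risk = n - i              # subjects still under observation
        if d:                        # failure: multiply by (1 - 1/at_risk)
            surv *= 1.0 - 1.0 / at_risk
        out.append((t, surv))        # censoring leaves the estimate flat
    return out
```

Censored observations only shrink the risk set, which is exactly why the classical estimator can end above zero (a sub-distribution function), the defect the paper's new estimators remove.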
Sensing charged macromolecules with nanocrystalline diamond-based field-effect capacitive sensors
(2008)
A multi-spot light-addressable potentiometric sensor (LAPS), which belongs to the family of semiconductor field-effect devices, was applied for label-free detection of double-stranded deoxyribonucleic acid (dsDNA) molecules by their intrinsic molecular charge. To reduce the distance between the DNA charge and the sensor surface, and thus to enhance the electrostatic coupling between the dsDNA molecules and the LAPS, the negatively charged dsDNA molecules were electrostatically adsorbed onto the gate surface of the LAPS covered with a positively charged weak polyelectrolyte layer of PAH (poly(allylamine hydrochloride)). The surface potential changes in each spot of the LAPS, induced by the layer-by-layer adsorption of a PAH/dsDNA bilayer, were recorded by means of photocurrent-voltage and constant-photocurrent measurements. In addition, the surface morphology of the gate surface before and after consecutive electrostatic adsorption of PAH and dsDNA layers was studied by atomic force microscopy measurements. Moreover, fluorescence microscopy was used to verify the successful adsorption of dsDNA molecules onto the PAH-modified LAPS surface. A high sensor signal of 25 mV was registered after adsorption of 10 nM dsDNA molecules. The lower detection limit extends down to 0.1 nM dsDNA. The obtained results demonstrate that the PAH-modified LAPS device provides a convenient and rapid platform for the direct label-free electrical detection of in-solution hybridized dsDNA molecules.
Sensitive and rapid detection of cholera toxin subunit B using magnetic frequency mixing detection
(2019)
Cholera is a life-threatening disease caused by the cholera toxin (CT) as produced by some Vibrio cholerae serogroups. In this research we present a method which directly detects the toxin’s B subunit (CTB) in drinking water. For this purpose we performed a magnetic sandwich immunoassay inside a 3D immunofiltration column. We used two different commercially available antibodies to capture CTB and for binding to superparamagnetic beads. ELISA experiments were performed to select the antibody combination. The beads act as labels for the magnetic frequency mixing detection technique. We show that the limit of detection depends on the type of magnetic beads. A nonlinear Hill curve was fitted to the calibration measurements by means of custom-written Python software. We achieved sensitive and rapid detection of CTB within a broad concentration range from 0.2 ng/ml to more than 700 ng/ml.
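The Hill-curve calibration step can be sketched with SciPy's `curve_fit`; the parameterization, starting values and data points below are synthetic placeholders chosen to span the reported range, not the paper's calibration data:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(c, top, ec50, n):
    """Nonlinear Hill curve: saturating sensor signal vs. concentration c."""
    return top * c**n / (ec50**n + c**n)

# synthetic calibration points over the reported range (ng/ml)
conc = np.array([0.2, 1.0, 5.0, 20.0, 100.0, 700.0])
signal = hill(conc, 1.0, 15.0, 1.2)      # noiseless placeholder "measurements"

# fit the three Hill parameters from the calibration data
popt, _ = curve_fit(hill, conc, signal, p0=[1.0, 10.0, 1.0])
top, ec50, n = popt
```

With real measurements one would pass the measured signals (and, ideally, their uncertainties via `sigma`) instead of the synthetic values.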
Sensitivity Analysis of General Aviation Aircraft with Parallel Hybrid-Electric Propulsion Systems
(2019)
Sensitivity of and Influences on the Reliability of an HTR-Module Primary Circuit Pressure Boundary
(1993)
Sensitivity of phase detection techniques in aerated chute flows to hydraulic design parameters
(2012)
Digital elevation models (DEMs) represent the three-dimensional terrain and are the basic input for numerical snow avalanche dynamics simulations. DEMs can be acquired using topographic maps or remote-sensing technologies, such as photogrammetry or lidar. Depending on the acquisition technique, different spatial resolutions and qualities are achieved. However, there is a lack of studies that investigate the sensitivity of snow avalanche simulation algorithms to the quality and resolution of DEMs. Here, we perform calculations using the numerical avalanche dynamics model RAMMS, varying the quality and spatial resolution of the underlying DEMs while holding the simulation parameters constant. We study both channelized and open-terrain avalanche tracks with variable roughness. To quantify the variance of these simulations, we use well-documented large-scale avalanche events from Davos, Switzerland (winter 2007/08), and from our large-scale avalanche test site, Vallée de la Sionne (winter 2005/06). We find that the DEM resolution and quality are critical for modeled flow paths, run-out distances, deposits, velocities and impact pressures. Although a spatial resolution of ~25 m is sufficient for large-scale avalanche modeling, the DEM datasets must be checked carefully for anomalies and artifacts before using them for dynamics calculations.
Sensitivity of turbulent Schmidt number and turbulence model to simulations of jets in crossflow
(2016)
Environmental discharges have traditionally been designed by means of cost-intensive and time-consuming experimental studies. Extensively validated models based on an integral approach, such as CORMIX as recommended by USEPA, have often been employed for water quality problems. In this study, FLOW-3D is employed for full 3D RANS modelling of two turbulent jet-to-crossflow cases, including free-surface jet impingement. Results are compared to both physical modelling and CORMIX to better assess model performance. Turbulence measurements have been collected for a better understanding of the parameter sensitivity of turbulent diffusion. Although both studied models are generally able to reproduce the jet trajectory, jet separation downstream of the impingement has been reproduced only by RANS modelling. Additionally, concentrations are better reproduced by FLOW-3D when a proper turbulent Schmidt number is used. This study provides a recommendation on the selection of the turbulence model and the turbulent Schmidt number for future outfall structure design studies.
Sensor positioning and thermal model for condition monitoring of pressure gas reservoirs in vehicles
(2018)
Existing residential buildings have an average lifetime of 100 years. Many of these buildings will exist for at least another 50 years. To increase the efficiency of these buildings while keeping costs at reasonable rates, they can be retrofitted with sensors that deliver information to central control units for heating, ventilation and electricity. This retrofitting process should happen with minimal intervention into existing infrastructure and requires new approaches for sensor design and data transmission. At FH Aachen University of Applied Sciences, students of different disciplines work together to learn how to design, build, deploy and operate such sensors. The presented teaching project has already created a low-power design for a combined CO2, temperature and humidity measurement device that can be easily integrated into most home automation systems.
A sensor system for investigating (bio)degradation processes of polymers is presented. The system utilizes semiconductor field-effect sensors and is capable of monitoring the degradation process in situ and in real time. The degradation of the polymer poly(d,l-lactic acid) is monitored exemplarily in solutions with different pH values, in pH-buffer solution containing the model enzyme lipase from Rhizomucor miehei, and in cell-culture medium containing supernatants from stimulated and non-stimulated THP-1-derived macrophages mimicking activation of the immune system.
Under DLR-contract, Giessen University and DLR Cologne are studying solar-electric propulsion (SEP) missions to the outer regions of the solar system. The most challenging reference mission concerns the transport of a 1.35-ton chemical lander spacecraft into an 80-RJ circular orbit around Jupiter, which would enable placing a 375 kg lander with 50 kg of scientific instruments on the surface of the icy moon "Europa". Thorough analyses show that the best solution in terms of SEP launch mass times thrusting time would be a two-stage EP module and a triple-junction solar array with concentrators which would be deployed step by step. Mission performance optimizations suggest propelling the spacecraft in the first EP stage by 6 gridded ion thrusters running at 4.0 kV of beam voltage, which would save launch mass, and in the second stage by 4 thrusters with 1.25 to 1.5 kV of positive high voltage, saving thrusting time. In this way, the launch mass of the spacecraft would be kept within 5.3 tons. Without a launcher's C3 and interplanetary gravity assists, Jupiter might be reached within about 4 yrs. The spiraling-down into the parking orbit would need another 1.8 yrs. This "large mission" can be scaled down to a smaller one, e.g., by halving all masses, the solar array power, and the number of thrusters. Due to their reliability, long lifetime and easy control, RIT-22 engines have been chosen for the mission analysis. Based on precise tests, the thruster performance has been modeled.
Under DLR-contract, sample return missions to the large main-belt asteroid “19, Fortuna” have been studied. The mission scenario has been based on three ion thrusters of the RIT-22 model, which is presently under space qualification, and on solar arrays equipped with triple-junction GaAs solar cells. After designing the spacecraft, the orbit-to-orbit trajectories for both a one-way SEP mission with a chemical sample return and an all-SEP return mission have been optimized using a combination of artificial neural networks with evolutionary algorithms. Additionally, body-to-body trajectories have been investigated within a launch period between 2012 and 2015. For the orbit-to-orbit calculation, the launch masses of the hybrid mission and of the all-SEP mission resulted in 2.05 tons and 1.56 tons, respectively, including a scientific payload of 246 kg. The related transfer durations were 4.14 yrs and 4.62 yrs. Finally, a comparison between the mission scenarios based on SEP and on NEP has been carried out, clearly favouring SEP.
Among renewable raw materials, wood holds great potential as a renewable and environmentally friendly resource. According to the Fachagentur für nachwachsende Rohstoffe (FNR), more than 11 million hectares of wood that can also be used for industrial purposes grew on German territory in 2013 alone. Of the annual timber harvest of 56.8 million m³ over the last ten years, almost half was used as a material and the rest for energy. In this work, the fractionation of the main polymeric components was achieved with low energy input on the basis of beech wood; beech is the third most common tree species in Germany after spruce and pine and accounts for 15% of the German forest area. In a downstream extraction process, the two components hemicellulose and lignin are separated in liquid form from the final solid cellulose fraction. The hemicellulose is extracted by a liquid hot water (LHW) treatment; the catalytic addition of inorganic acids such as H₃PO₄ and H₂SO₄ is investigated. With a view to the further conversion of lignin into aromatic synthesis building blocks, Organosolv extraction with an ethanol/water mixture is employed. An advantage is that both product streams can be used further without a precipitation step and without disadvantageous dilution of the hemicellulose.
Series production and testing of a micro motor
(1998)
The potential of electronic markets in enabling innovative product bundles through flexible and sustainable partnerships is not yet fully exploited in the telecommunication industry. One reason is that bundling requires seamless de-assembling and re-assembling of business processes, whilst processes in telecommunication companies are often product-dependent and hard to virtualize. We propose a framework for the planning of the virtualization of processes, intended to assist the decision maker in prioritizing the processes to be virtualized: (a) we transfer the virtualization pre-requisites stated by the Process Virtualization Theory in the context of customer-oriented processes in the telecommunication industry and assess their importance in this context, (b) we derive IT-oriented requirements for the removal of virtualization barriers and highlight their demand on changes at different levels of the organization. We present a first evaluation of our approach in a case study and report on lessons learned and further steps to be performed.
Shakedown analysis of Reissner-Mindlin plates using the edge-based smoothed finite element method
(2014)
This paper concerns the development of a primal-dual algorithm for limit and shakedown analysis of Reissner-Mindlin plates made of von Mises material. At each optimization iteration, the lower bound of the shakedown load multiplier is calculated simultaneously with the upper bound using the duality theory. An edge-based smoothed finite element method (ES-FEM) combined with the discrete shear gap (DSG) technique is used to improve the accuracy of the solutions and to avoid the transverse shear locking behaviour. The method not only possesses all inherent features of convergence and accuracy from ES-FEM, but also ensures that the total number of variables in the optimization problem is kept to a minimum compared with the standard finite element formulation. Numerical examples are presented to demonstrate the effectiveness of the present method.
Shakedown analysis of two dimensional structures by an edge-based smoothed finite element method
(2010)
In this paper we propose a stochastic programming method to analyse the limit and shakedown loads of structures under uncertain strength conditions. Based on the duality theory, the shakedown load multiplier formulated by the kinematic theorem is proved to actually be the dual form of the shakedown load multiplier formulated by the static theorem. In this investigation a dual chance-constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit and the shakedown limit. The edge-based smoothed finite element method (ES-FEM) with three-node linear triangular elements is used for structural analysis.
This article addresses the need for an innovative technique in plasma shaping, utilizing antenna structures, Maxwell’s laws, and boundary conditions within a shielded environment. The motivation lies in exploring a novel approach to efficiently generate high-energy density plasma with potential applications across various fields. Implemented in an E01 circular cavity resonator, the proposed method involves the use of an impedance and field matching device with a coaxial connector and a specially optimized monopole antenna. This setup feeds a low-loss cavity resonator, resulting in a high-energy density air plasma with a surface temperature exceeding 3500 °C, achieved with a minimal power input of 80 W. The argon plasma, resembling the shape of a simple monopole antenna with modeled complex dielectric values, offers a more energy-efficient alternative compared to traditional, power-intensive plasma shaping methods. Simulations using a commercial electromagnetic (EM) solver validate the design’s effectiveness, while experimental validation underscores the method’s feasibility and practical implementation. Analyzing various parameters in an argon atmosphere, including hot S-parameters and plasma beam images, the results demonstrate the successful application of this technique, suggesting its potential in coating, furnace technology, fusion, and spectroscopy applications.
SHEMAT-Suite: An open-source code for simulating flow, heat and species transport in porous media
(2020)
SHEMAT-Suite is a finite-difference open-source code for simulating coupled flow, heat and species transport in porous media. The code, written in Fortran-95, originates from geoscientific research in the fields of geothermics and hydrogeology. It comprises: (1) a versatile handling of input and output, (2) a modular framework for subsurface parameter modeling, (3) a multi-level OpenMP parallelization, (4) parameter estimation and data assimilation by stochastic approaches (Monte Carlo, Ensemble Kalman filter) and by deterministic Bayesian approaches based on automatic differentiation for calculating exact (truncation error-free) derivatives of the forward code.
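A minimal 1D sketch of the kind of explicit finite-difference update such codes build on for the heat-transport part; the grid, material parameter and boundary values below are illustrative assumptions, not SHEMAT-Suite's actual (Fortran-95, implicit, coupled) scheme:

```python
import numpy as np

def heat_step(T, alpha, dx, dt):
    """One explicit finite-difference step of the 1D heat equation
    dT/dt = alpha * d2T/dx2 with fixed (Dirichlet) boundary values."""
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    return T_new

# 11-node grid on [0, 1]; explicit stability requires alpha*dt/dx**2 <= 0.5
T = np.zeros(11)
T[0] = 1.0                       # hot left boundary, cold right boundary
for _ in range(200):             # march towards the linear steady state
    T = heat_step(T, alpha=1.0, dx=0.1, dt=0.004)
```

After enough steps the profile relaxes to the linear conductive steady state between the two fixed boundary temperatures.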
Shielding effectiveness of reinforced concrete cable ducts carrying partial lightning currents
(1998)
Shielding gas influences on laser weldability of tailored blanks of advanced automotive steels
(2010)
The effects of shielding gas types and flow rates on the CO2 laser weldability of DP600/TRIP700 steel sheets were studied in this work. The evaluated shielding gases were helium (He), argon (Ar) and different mixtures of He and Ar. Weld penetration, tensile strength and formability (Erichsen test) of laser welds were found to be strongly dependent upon the shielding gas type. The ability of the shielding gas to remove the plasma plume, and thus increase weld penetration, is believed to be closely related to the ionization potential and atomic weight, which determine the period of plasma formation and disappearance. It was found that the higher the helium shielding gas flow rate, the deeper the weld penetration and the narrower the weld width.
Short term effects of magnetic resonance imaging on excitability of the motor cortex at 1.5T and 7T
(2010)
Rationale and Objectives
The increasing spread of high-field and ultra-high-field magnetic resonance imaging (MRI) scanners has encouraged new discussion of the safety aspects of MRI. Few studies have been published on possible cognitive effects of MRI examinations. The aim of this study was to examine whether changes are measurable after MRI examinations at 1.5 and 7 T by means of transcranial magnetic stimulation (TMS).
Materials and Methods
TMS was performed in 12 healthy, right-handed male volunteers. First the individual motor threshold was specified, and then the cortical silent period (SP) was measured. Subsequently, the volunteers were exposed to the 1.5-T MRI scanner for 63 minutes using standard sequences. The MRI examination was immediately followed by another TMS session. Fifteen minutes later, TMS was repeated. Four weeks later, the complete setting was repeated using a 7-T scanner. Control conditions included lying in the 1.5-T scanner for 63 minutes without scanning and lying in a separate room for 63 minutes. TMS was performed in the same way in each case. For statistical analysis, Wilcoxon's rank test was performed.
Results
Immediately after MRI exposure, the SP was highly significantly prolonged in all 12 subjects at 1.5 and 7 T. The motor threshold was significantly increased. Fifteen minutes after the examination, the measured value tended toward normal again. Control conditions revealed no significant differences.
Conclusion
MRI examinations lead to a transient and highly significant alteration in cortical excitability. This effect does not seem to depend on the strength of the static magnetic field.
Treatment of posttraumatic osteoarthritis of the radial column of the elbow joint remains a challenging yet common issue.
While partial joint replacement leads to high revision rates, radial head excision has been shown to severely increase joint instability. Shortening osteotomy of the radius could be an option to decrease the contact pressure of the radiohumeral joint, and thereby pain levels, without causing valgus instability. Hence, the aim of this biomechanical study was to evaluate the effects of radial shortening on axial load distribution and valgus stability of the elbow joint.
Side bands in ¹⁷² Hf
(1978)
Side bands in ¹⁷² Hf
(1977)
Side bands in ¹⁷² Hf
(1978)
Side-bands in ¹⁸⁰ Os
(1981)
Digital Image Correlation (DIC) is a powerful tool used to evaluate displacements and deformations in a non-intrusive manner. By comparing two images, one of the undeformed reference state of a specimen and another of the deformed target state, the relative displacement between those two states is determined. DIC is well known and often used for post-processing analysis of in-plane displacements and deformations of specimens. Increasing the analysis speed to enable real-time DIC analysis would be beneficial and would extend the field of use of this technique.
Here we tested several combinations of the most common DIC methods in combination with different parallelization approaches in MATLAB and evaluated their performance to determine whether real-time analysis is possible with these methods. To reflect improvements in computing technology, different hardware settings were also analysed. We found that implementation problems can reduce the efficiency of a theoretically superior algorithm such that it becomes practically slower than a suboptimal algorithm. The Newton-Raphson algorithm in combination with a modified Particle Swarm algorithm in parallel image computation was found to be most effective. This is contrary to theory, which suggests that the inverse-compositional Gauss-Newton algorithm is superior. As expected, the Brute Force Search algorithm is the least effective method. We also found that the correct choice of parallelization tasks is crucial to achieve improvements in computing speed. A poorly chosen parallelization approach with high parallel overhead leads to inferior performance. Finally, irrespective of the computing mode, the correct choice of combinations of integer-pixel and sub-pixel search algorithms is decisive for an efficient analysis. Using currently available hardware, real-time analysis at high frame rates remains an aspiration.
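The integer-pixel Brute Force Search stage mentioned above can be sketched as a zero-normalized cross-correlation (ZNCC) scan of the reference subset over a search window; the subset size, search radius and indexing convention are illustrative assumptions (and the original work used MATLAB, not Python):

```python
import numpy as np

def integer_pixel_search(ref_patch, target, search=5):
    """Brute-force integer-pixel DIC: slide the reference subset over a
    search window in the target image and return the (du, dv) displacement
    that maximizes the zero-normalized cross-correlation (ZNCC).

    Convention: zero displacement maps the subset to target[search:search+h,
    search:search+w], so target must have a margin of `search` pixels.
    """
    h, w = ref_patch.shape
    r = (ref_patch - ref_patch.mean()) / ref_patch.std()
    best, best_uv = -np.inf, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            win = target[search + dv: search + dv + h,
                         search + du: search + du + w]
            t = (win - win.mean()) / win.std()
            score = (r * t).mean()       # ZNCC in [-1, 1]
            if score > best:
                best, best_uv = score, (du, dv)
    return best_uv
```

A sub-pixel stage (e.g. Newton-Raphson or inverse-compositional Gauss-Newton, as discussed above) would then refine this integer estimate.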
In this study, a recently proposed NMR standardization approach by ²H integral of deuterated solvent for quantitative multicomponent analysis of complex mixtures is presented. As a proof of principle, the existing NMR routine for the analysis of Aloe vera products was modified. Instead of using absolute integrals of targeted compounds and internal standard (nicotinamide) from ¹H-NMR spectra, quantification was performed based on the ratio of a particular ¹H-NMR compound integral and the ²H-NMR signal of the deuterated solvent D₂O. Validation characteristics (linearity, repeatability, accuracy) were evaluated and the results showed that the method has the same precision as internal standardization in the case of multicomponent screening. Moreover, a dehydration process by freeze drying is not necessary for the new routine. Now, our NMR profiling of A. vera products needs only limited sample preparation and data processing. The new standardization methodology provides an appealing alternative for multicomponent NMR screening. In general, this novel approach, using standardization by ²H integral, benefits from reduced sample preparation steps and uncertainties, and is recommended in different application areas (purity determination, forensics, pharmaceutical analysis, etc.).
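The quantification arithmetic behind such ratio-based qNMR — analyte concentration from the ratio of two signal integrals, corrected for the number of nuclei contributing to each signal — can be sketched as follows; the function and argument names are illustrative, not the paper's exact routine:

```python
def conc_from_integral_ratio(i_analyte, i_ref, n_analyte, n_ref, conc_ref):
    """Standard qNMR relation: concentration of the analyte from the ratio
    of its signal integral (i_analyte) to a reference integral (i_ref),
    scaled by the nuclei counts per molecule behind each signal."""
    return conc_ref * (i_analyte / i_ref) * (n_ref / n_analyte)

# e.g. an analyte signal from 2 protons against a 1-nucleus reference signal
c = conc_from_integral_ratio(2.0, 1.0, n_analyte=2, n_ref=1, conc_ref=10.0)
```

In the paper's setting the reference integral would be the ²H signal of the D₂O solvent rather than an added internal standard, which is what removes the freeze-drying step.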
The Volatility Framework is a collection of tools for the analysis of computer RAM. The framework offers a multitude of analysis options and is used by many investigators worldwide. Volatility currently comes with a command line interface only, which might be a hindrance for some investigators. In this paper we present a GUI and extensions for the Volatility Framework which, on the one hand, simplify the usage of the tool and, on the other hand, offer additional functionality such as storage of results in a database, shortcuts for long Volatility Framework command sequences, and entirely new commands based on correlation of the data stored in the database.
Simulating the electromagnetic‐thermal treatment of thin aluminium layers for adhesion improvement
(2015)
A composite layer material used in the packaging industry is made by joining layers of different materials using an adhesive. An important processing step in the production of aluminium-containing composites is the surface treatment and subsequent coating of adhesive material on the aluminium surface. To increase the adhesion strength between the aluminium layer and the adhesive material, the foil is heat treated. For efficient heating, induction heating is considered the state-of-the-art treatment process. Due to the complexity of the heating process and the unpredictable nature of the heating source, the control of the process is not yet optimised. In this work, a finite element analysis of the process was established and various process parameters were studied. The process was simplified and modelled in 3D. The numerical model contains an air domain, an aluminium layer and a copper coil fitted with a magnetic field concentrating material. The effect of changing the speed of the aluminium foil (or rolling speed) was studied together with variation of the coil current. Statistical analysis was used to generate a general control equation for the coil current as a function of rolling speed.
Simulation and measurement of melting effects on metal sheets caused by direct lightning strikes
(1991)
Simulation model for the transient process behaviour of solar aluminium recycling in a rotary kiln
(2015)
We present an electromechanically coupled finite element model for cardiac tissue. It is based on the mechanical model for cardiac tissue of Hunter et al., which we couple to the McAllister-Noble-Tsien electrophysiological model of Purkinje fibre cells. The corresponding system of ordinary differential equations is implemented at the level of the constitutive equations in a geometrically and physically nonlinear version of the so-called edge-based smoothed FEM for plates. Mechanical material parameters are determined from our own pressure-deflection experimental setup. The main purpose of the model is to further examine the experimental results not only at the mechanical but also at the electrophysiological level, down to ion channel gates. Moreover, we present first drug-treatment simulations and validate the model with respect to the experiments.