Shakedown analysis of Reissner-Mindlin plates using the edge-based smoothed finite element method
(2014)
This paper concerns the development of a primal-dual algorithm for limit and shakedown analysis of Reissner-Mindlin plates made of von Mises material. At each optimization iteration, the lower bound of the shakedown load multiplier is calculated simultaneously with the upper bound using the duality theory. An edge-based smoothed finite element method (ES-FEM) combined with the discrete shear gap (DSG) technique is used to improve the accuracy of the solutions and to avoid the transverse shear locking behaviour. The method not only possesses all inherent features of convergence and accuracy from ES-FEM, but also ensures that the total number of variables in the optimization problem is kept to a minimum compared with the standard finite element formulation. Numerical examples are presented to demonstrate the effectiveness of the present method.
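The simultaneous lower/upper bound computation mentioned in this abstract rests on the classical shakedown theorems. As a hedged sketch (the notation below is assumed for illustration and is not taken from the paper itself), the static (Melan) theorem gives the lower bound via a time-independent residual stress field, the kinematic (Koiter) theorem gives the upper bound via admissible plastic strain rate cycles, and duality closes the gap:

```latex
% Static (Melan) theorem: lower bound on the shakedown multiplier
\lambda^{-} \;=\; \max_{\bar{\rho}}\;\lambda
\quad\text{s.t.}\quad
f\!\left(\lambda\,\sigma^{E}(\mathbf{x},t) + \bar{\rho}(\mathbf{x})\right)\le 0
\;\;\forall\, t,
\qquad \bar{\rho}\ \text{self-equilibrated}.

% Kinematic (Koiter) theorem: upper bound via plastic dissipation
\lambda^{+} \;=\; \min_{\dot{\varepsilon}^{p}}\;
\frac{\displaystyle\int_{0}^{T}\!\!\int_{V} D\!\left(\dot{\varepsilon}^{p}\right)\,\mathrm{d}V\,\mathrm{d}t}
     {\displaystyle\int_{0}^{T}\!\!\int_{V} \sigma^{E} : \dot{\varepsilon}^{p}\,\mathrm{d}V\,\mathrm{d}t}.

% Duality: both bounds bracket the exact shakedown multiplier
\lambda^{-} \;\le\; \lambda_{\mathrm{sd}} \;\le\; \lambda^{+}.
```

A primal-dual algorithm of the kind described in the abstract iterates on both problems at once, so the gap between the two bounds serves as a built-in convergence measure.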
Shakedown analysis of two dimensional structures by an edge-based smoothed finite element method
(2010)
In this paper we propose a stochastic programming method for limit and shakedown analysis of structures under strength uncertainty. Based on duality theory, the shakedown load multiplier formulated by the kinematic theorem is shown to be the dual form of the shakedown load multiplier formulated by the static theorem. A dual chance-constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit and the shakedown limit. The edge-based smoothed finite element method (ES-FEM) with three-node linear triangular elements is used for the structural analysis.
Treatment of posttraumatic osteoarthritis of the radial column of the elbow joint remains a challenging yet common issue.
While partial joint replacement leads to high revision rates, radial head excision has been shown to severely increase joint instability. Shortening osteotomy of the radius could be an option to decrease the contact pressure in the radiohumeral joint, and thereby pain levels, without causing valgus instability. Hence, the aim of this biomechanical study was to evaluate the effects of radial shortening on axial load distribution and valgus stability of the elbow joint.
Side bands in ¹⁷²Hf
(1978)
Side bands in ¹⁷²Hf
(1977)
Side-bands in ¹⁸⁰Os
(1981)
Simulating the electromagnetic‐thermal treatment of thin aluminium layers for adhesion improvement
(2015)
A composite layer material used in the packaging industry is made by joining layers of different materials with an adhesive. An important processing step in the production of aluminium-containing composites is the surface treatment and subsequent coating of adhesive material on the aluminium surface. To increase the adhesion strength between the aluminium layer and the adhesive material, the foil is heat-treated. For efficient heating, induction heating is considered the state-of-the-art treatment process. Due to the complexity of the heating process and the unpredictable nature of the heating source, the control of the process is not yet optimised. In this work, a finite element analysis of the process was established and various process parameters were studied. The process was simplified and modelled in 3D. The numerical model contains an air domain, an aluminium layer and a copper coil fitted with a magnetic-field-concentrating material. The effect of changing the speed of the aluminium foil (the rolling speed) was studied together with the change of the coil current. Statistical analysis was used to derive a general control equation relating coil current to rolling speed.
We present an electromechanically coupled finite element model for cardiac tissue. It is based on the mechanical model for cardiac tissue of Hunter et al., which we couple to the McAllister-Noble-Tsien electrophysiological model of Purkinje fibre cells. The corresponding system of ordinary differential equations is implemented at the level of the constitutive equations in a geometrically and physically nonlinear version of the so-called edge-based smoothed FEM for plates. Mechanical material parameters are determined from our own pressure-deflection experimental setup. The main purpose of the model is to further examine the experimental results not only at the mechanical but also at the electrophysiological level, down to ion channel gates. Moreover, we present first drug-treatment simulations and validate the model against the experiments.
The interest in PET detectors with monolithic block scintillators is growing. In order to obtain high spatial resolution, dedicated positioning algorithms are required. But even an ideal algorithm can only deliver information that is provided by the detector. In this simulation study we investigated the light distribution on one surface of cuboid LSO scintillators of different sizes. Scintillators with a large aspect ratio (small footprint and large height) showed significant position information only above a minimum interaction depth of the gamma particle. The results allow a quantitative estimate of a useful aspect ratio.
In positron emission tomography, improving the time, energy and spatial resolutions of the detectors and exploiting Compton kinematics makes it possible to reconstruct a radioactivity distribution image from scatter coincidences, thereby enhancing image quality. The number of single-scattered coincidences alone is of the same order of magnitude as that of true coincidences. In this work, a compact Compton camera module based on monolithic scintillation material is investigated as a detector ring module. The detector interactions are simulated with the Monte Carlo package GATE. The scattering angle inside the tissue is derived from the energy of the scattered photon, which results in a set of possible scattering trajectories, i.e. a broken line of response. Compton kinematics collimation reduces the number of solutions. Additionally, time-of-flight information helps to localize the position of the annihilation. One question of this investigation is how the energy, spatial and temporal resolutions help confine the possible annihilation volume. A comparison of currently technically feasible detector resolutions (under laboratory conditions) demonstrates the influence on this annihilation volume and shows that energy and coincidence time resolution have a significant impact. An enhancement of the latter from 400 ps to 100 ps leads to an annihilation volume around 50% smaller, while a change of the energy resolution in the absorber layer from 12% to 4.5% results in a reduction of 60%. The inclusion of single tissue-scattered data has the potential to increase the sensitivity of a scanner by a factor of 2 to 3. The concept can be further optimized and extended to multiple scatter coincidences and subsequently validated by a reconstruction algorithm.
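The time-of-flight argument in the abstract above can be sanity-checked with the standard TOF-PET relation Δx = c·Δt/2, which maps the coincidence time resolution to a localization uncertainty along the line of response. This is a minimal sketch; the relation and the function name are assumptions for illustration, not taken from the abstract:

```python
C = 2.998e8  # speed of light in vacuum, m/s


def tof_localization_fwhm(coincidence_time_resolution_s: float) -> float:
    """Spatial localization (FWHM, metres) along the line of response
    for a given coincidence time resolution: dx = c * dt / 2."""
    return C * coincidence_time_resolution_s / 2.0


# The two resolutions mentioned in the abstract:
dx_400ps = tof_localization_fwhm(400e-12)  # ~0.060 m, i.e. ~6.0 cm
dx_100ps = tof_localization_fwhm(100e-12)  # ~0.015 m, i.e. ~1.5 cm
```

Going from 400 ps to 100 ps shrinks the localization window by a factor of four along one axis, which is consistent with the substantial annihilation-volume reduction reported in the abstract.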
Simultaneous detection of cyanide and heavy metals for environmental analysis by means of µISEs
(2010)
Superparamagnetic iron oxide nanoparticles (SPION) are extensively used for magnetic resonance imaging (MRI) and magnetic particle imaging (MPI), as well as for magnetic fluid hyperthermia (MFH). We here describe a sequential centrifugation protocol to obtain SPION with well-defined sizes from a polydisperse SPION starting formulation, synthesized using the routinely employed co-precipitation technique. Transmission electron microscopy, dynamic light scattering and nanoparticle tracking analyses show that the SPION fractions obtained upon size-isolation are well-defined and almost monodisperse. MRI, MPI and MFH analyses demonstrate improved imaging and hyperthermia performance for size-isolated SPION as compared to the polydisperse starting mixture, as well as to commercial and clinically used iron oxide nanoparticle formulations, such as Resovist® and Sinerem®. The size-isolation protocol presented here may help to identify SPION with optimal properties for diagnostic, therapeutic and theranostic applications.
Socio-technical scenarios for energy-intensive industries: the future of steel production in Germany
(2019)
Soft Materials in Technology and Biology – Characteristics, Properties, and Parameter Identification
(2008)
At the present time, one of the most serious environmental problems of Central Asia and South Kazakhstan is the ongoing large-scale deterioration of principal urban tree populations. Several major centers of massive spread of invasive plant pests have been found in the urban dendroflora of this region. The degree of damage of the seven most widespread native tree species was found to range from 21.4±1.1 to 85.4±1.8%. In particular, the integrity of the native communities of sycamore (Platanus spp.), willow (Salix spp.), poplar (Populus spp.) and elm (Ulmus spp.) is highly endangered. Our taxonomic analysis of the most dangerous tree pests of the region has identified them as neobiontic xylophilous insects such as Cossus cossus L. (Order: Lepidoptera L.), Monochamus urussovi Fisch., Monochamus sutor L., Acanthocinus aedilis L. and Cetonia aurata L. (Order: Coleoptera L.). We relate the origin of this threatening trend to the import of industrial wood in the mid-1990s, associated with the high volume of construction work in the region at that time. Because efficient natural predators of the pest species are absent, the application of microbiological methods of pest control and limitation is suggested.
In this work, a spore-based biosensor is evaluated to monitor the microbicidal efficacy of sterilization processes applying gaseous hydrogen peroxide (H2O2). The sensor is based on interdigitated electrode structures (IDEs) that have been fabricated by means of thin-film technologies. Impedimetric measurements are applied to study the effect of the sterilization process on spores of Bacillus atrophaeus. This resilient microorganism is commonly used in industry to verify sterilization efficiency. The sensor measurements are accompanied by conventional microbiological challenge tests, as well as morphological characterizations with scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The sensor measurements are correlated with the microbiological test routines. In both methods, namely the sensor-based and the microbiological one, a tailing effect has been observed. The results are evaluated and discussed in a three-dimensional calibration plot demonstrating the sensor's suitability to enable a rapid process decision in terms of a successfully performed sterilization.
The progress in natural language processing (NLP) research over the last years offers novel business opportunities for companies, such as automated user interaction or improved data analysis. Building sophisticated NLP applications requires dealing with modern machine learning (ML) technologies, which hinders enterprises from establishing successful NLP projects. Our experience in applied NLP research projects shows that the continuous integration of research prototypes into production-like environments with quality assurance builds trust in the software and demonstrates its convenience and usefulness with regard to the business goal. We introduce STAMP 4 NLP, an iterative and incremental process model for developing NLP applications. With STAMP 4 NLP, we merge software engineering principles with best practices from data science. Instantiating our process model allows prototypes to be created efficiently by utilizing templates, conventions, and implementations, enabling developers and data scientists to focus on the business goals. Due to our iterative-incremental approach, businesses can deploy an enhanced version of the prototype to their software environment after every iteration, maximizing potential business value and trust early and avoiding the cost of successful yet never-deployed experiments.