Conference Proceeding
Recognition of subjects with mild cognitive impairment (MCI) by the use of retinal arterial vessels.
(2019)
An application of a scanning light-addressable potentiometric sensor for label-free DNA detection
(2013)
DNA-hybridization detection using light-addressable potentiometric sensor modified with gold layer
(2014)
A capacitive electrolyte-insulator-semiconductor (EISCAP) biosensor modified with Tobacco mosaic virus (TMV) particles for the detection of acetoin is presented. The enzyme acetoin reductase (AR) was immobilized on the surface of the EISCAP using TMV particles as nanoscaffolds. The study focused on the optimization of the TMV-assisted AR immobilization on the Ta2O5-gate EISCAP surface. The TMV-assisted acetoin EISCAPs were electrochemically characterized by means of leakage-current, capacitance-voltage, and constant-capacitance measurements. The TMV-modified transducer surface was studied via scanning electron microscopy.
We report on the synthesis and CO gas-sensing properties of mesoporous tin(IV) oxides (SnO2). For the synthesis cetyltrimethylammonium bromide (CTABr) was used as a structure-directing agent; the resulting SnO2 powders were applied as films to commercially available sensor substrates by drop coating. Nitrogen physisorption shows specific surface areas up to 160 m2·g-1 and mean pore diameters of about 4 nm, as verified by TEM. The film conductance was measured in dependence on the CO concentration in humid synthetic air at a constant temperature of 300 °C. The sensors show a high sensitivity at low CO concentrations and turn out to be largely insensitive towards changes in the relative humidity. We compare the materials with commercially available SnO2-based sensors.
Various planar technologies are employed for developing solid-state sensors with low cost, small size and high reproducibility; thin- and thick-film technologies are the most suitable for such production. Screen-printing is especially attractive due to its simplicity, low cost, high reproducibility and efficiency in large-scale production. This technology enables the deposition of thick layers and allows precise pattern control. Moreover, it is highly economical: because the process is additive, repeated film-deposition steps waste virtually no ink or other material. Finally, the thick films can be deposited easily and quickly on inexpensive substrates. In this contribution, thick-film ion-selective electrodes based on ionophores as well as on crystalline ion-selective materials, dedicated to potentiometric measurements, are demonstrated. The analytical parameters of these sensors are comparable with those reported for conventional potentiometric electrodes. All of the presented thick-film strip electrodes were fabricated entirely in a single, fully automated thick-film process, without any additional manual, chemical or electrochemical steps. In all cases simple, inexpensive, commercially available materials were used, i.e. flexible plastic substrates and easily cured polymer-based pastes.
A procedure for the evaluation of the failure probability of elastic-plastic thin shell structures is presented. The procedure involves a deterministic limit and shakedown analysis for each probabilistic iteration, based on the kinematical approach and the use of the exact Ilyushin yield surface. Based on a direct definition of the limit state function, the non-linear problems may be efficiently solved by using the First and Second Order Reliability Methods (FORM/SORM). This direct approach considerably reduces the necessary knowledge of uncertain technological input data, the computing costs and the numerical error. In: Computational plasticity / ed. by Eugenio Onate. Dordrecht: Springer, 2007. VII, 265 p. (Computational Methods in Applied Sciences; 7) (COMPLAS IX, Part 1. International Center for Numerical Methods in Engineering (CIMNE)). ISBN 978-1-402-06576-7, pp. 186-189.
Proceedings of the International Conference on Material Theory and Nonlinear Dynamics (MatDyn), Hanoi, Vietnam, Sept. 24-26, 2007, 8 p. In this paper, a method is introduced to determine the limit load of general shells using the finite element method. The method is based on an upper-bound limit and shakedown analysis with an elastic-perfectly plastic material model. A non-linear constrained optimisation problem is solved by using Newton's method in conjunction with a penalty method and the Lagrangian dual method. A numerical investigation of a pipe bend subjected to bending moments proves the effectiveness of the algorithm.
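The optimisation machinery named in the abstract, Newton's method combined with a penalty treatment of the constraints, can be sketched on a toy equality-constrained problem (a generic illustration, not the paper's finite element formulation):

```python
# Toy version of the optimisation ingredients (assumed example, not the
# paper's shakedown formulation):
#   minimize  x1^2 + x2^2   subject to   x1 + x2 = 1
# using a quadratic penalty and Newton's method on the penalized objective.

def solve_penalty_newton(rho, iters=20):
    x = [0.0, 0.0]
    for _ in range(iters):
        c = x[0] + x[1] - 1.0                  # constraint residual
        g = [2.0 * x[0] + rho * c,             # gradient of
             2.0 * x[1] + rho * c]             # f + (rho/2) * c^2
        a, b = 2.0 + rho, rho                  # constant Hessian [[a, b], [b, a]]
        det = a * a - b * b
        # Newton step: x <- x - H^{-1} g
        x = [x[0] - (a * g[0] - b * g[1]) / det,
             x[1] - (-b * g[0] + a * g[1]) / det]
    return x

x = solve_penalty_newton(rho=1e6)
# the exact constrained minimizer is (0.5, 0.5); the penalty solution
# approaches it as rho grows
```

Because the penalized objective here is quadratic, a single Newton step already minimizes it exactly; in the nonlinear shell problem the same step has to be iterated.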
Shakedown analysis of two dimensional structures by an edge-based smoothed finite element method
(2010)
Summary: This paper presents a methodology for studying and understanding the mechanics of stapled anastomotic behaviour by combining empirical experimentation and finite element analysis. The performance of a stapled anastomosis is studied in terms of leakage, and the numerical results are compared to in vitro experiments performed on fresh porcine tissue. The results suggest that leaks occur between the tissue and the staple legs penetrating through the tissue.
Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics that plays a significant role in mechanical and civil engineering design. The concept of direct methods is to determine the ultimate load-bearing capacity of structures beyond the elastic range. For practical problems, direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, the shakedown analysis problem becomes one of stochastic programming. This paper presents chance-constrained programming, an effective method of stochastic programming, to solve the shakedown analysis problem under random strength conditions. In our investigation, the loading is deterministic and the strength is distributed as a normal or lognormal variable.
A new formulation to calculate the shakedown limit load of Kirchhoff plates under stochastic conditions of strength is developed. Direct structural reliability design by chance-constrained programming is based on prescribed failure probabilities; it is an effective approach of stochastic programming if the problem can be formulated as an equivalent deterministic optimization problem. We restrict the uncertainty to the strength, while the loading remains deterministic. A new formulation is derived for the case of random strength with lognormal distribution. Upper-bound and lower-bound shakedown load factors are calculated simultaneously by a dual algorithm.
We propose a stochastic programming method to analyse the limit and shakedown loads of structures under random strength with lognormal distribution. In this investigation, a dual chance-constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit or the shakedown limit. The edge-based smoothed finite element method (ES-FEM) using three-node linear triangular elements is used.
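The common core of the three abstracts above, reducing a chance constraint with lognormal strength to an equivalent deterministic bound, can be sketched as follows (the parameter values are assumed for illustration and are not taken from the papers):

```python
# Deterministic equivalent of a chance constraint with lognormal strength
# (generic illustration; mu, sigma and p are assumed example values).
# Require P(R >= s) >= p for strength R with ln R ~ N(mu, sigma^2);
# this is equivalent to the deterministic bound s <= exp(mu + sigma * z),
# where z = Phi^{-1}(1 - p) is a standard normal quantile.

from math import exp
from statistics import NormalDist

def admissible_load(mu, sigma, p):
    z = NormalDist().inv_cdf(1.0 - p)
    return exp(mu + sigma * z)

# median strength exp(mu) ~ 100, sigma = 0.1, required reliability p = 0.95
s_max = admissible_load(mu=4.60517, sigma=0.1, p=0.95)
# s_max lies below the median strength, as it must for any p > 0.5
```

This equivalent deterministic form is what makes the shakedown problem under random strength tractable as an ordinary (convex) optimization problem.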
Design and implementation aspects of a 3D reconstruction algorithm for the Jülich TierPET system
(1997)
The interest in PET detectors with monolithic block scintillators is growing. In order to obtain high spatial resolution, dedicated positioning algorithms are required. However, even an ideal algorithm can only deliver information that is provided by the detector. In this simulation study, we investigated the light distribution on one surface of cuboid LSO scintillators of different sizes. Scintillators with a large aspect ratio (small footprint and large height) showed significant position information only for a minimum interaction depth of the gamma particle. The results allow a quantitative estimate of a useful aspect ratio.
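A common baseline positioning algorithm for monolithic scintillators is the centre of gravity (Anger logic) of the measured light distribution; the sketch below is a generic illustration of that idea, not necessarily the algorithm used in the study, with hypothetical pixel positions and intensities:

```python
# Centre-of-gravity (Anger logic) position estimate from a measured light
# distribution; hypothetical 1D example.

def centre_of_gravity(positions, intensities):
    total = sum(intensities)
    return sum(p * i for p, i in zip(positions, intensities)) / total

# photosensor pixel centres (mm) and a light distribution peaked at 2 mm
pix = [0.0, 1.0, 2.0, 3.0, 4.0]
light = [1.0, 4.0, 9.0, 4.0, 1.0]
x_hat = centre_of_gravity(pix, light)
# a symmetric, peaked distribution yields the peak position; a broad, flat
# distribution (shallow interaction depth) carries little position information
```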
7th International Conference on Reliability of Materials and Structures (RELMAS 2008), June 17-20, 2008, Saint Petersburg, Russia, pp. 354-358. Reprint with corrections in red. Introduction: The analysis of advanced structures working under extremely heavy loading, such as nuclear power plants and piping systems, should take into account the randomness of loading, geometrical and material parameters. The existing reliability analyses are restricted mostly to the elastic working regime, e.g. allowable local stresses. The development of limit and shakedown reliability-based analysis and design methods, exploiting the potential of the shakedown working regime, is highly needed. In this paper, the application of a new algorithm of probabilistic limit and shakedown analysis for shell structures is presented, in which the loading and the strength of the material as well as the thickness of the shell are considered as random variables. The reliability analysis problems may be efficiently solved by using a system combining the available FE codes, a deterministic limit and shakedown analysis, and the First and Second Order Reliability Methods (FORM/SORM). Non-linear sensitivity analyses are obtained directly from the solution of the deterministic problem without extra computational costs.
When confining pressure is low or absent, extensional fractures are typical, with fractures occurring on unloaded planes in rock. These "paradox" fractures can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock was developed, but it makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting failure modes that are unexpected in the common understanding of the failure of cohesive-frictional materials. In the progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD) and as a failure surface at peak strength (P). Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass than the simple extension strain criterion.
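The simple extension strain criterion the abstract starts from can be sketched as follows; the enriched criterion adds a weighted principal shear term on top of this, whose calibration against Mohr–Coulomb is not reproduced here (material parameters are assumed example values):

```python
# Simple extension strain criterion: principal strains follow from principal
# stresses via Hooke's law, and fracture initiation is predicted once the
# extension strain exceeds a critical value e_c. Assumed example parameters.

def min_principal_strain(s1, s2, s3, E, nu):
    # compression positive; extension appears as a negative strain
    return (s3 - nu * (s1 + s2)) / E

def simple_extension_criterion(s1, s2, s3, E, nu, e_c):
    e3 = min_principal_strain(s1, s2, s3, E, nu)
    return -e3 >= e_c          # True -> fracture initiation predicted

# uniaxial compression of a rock-like material: stresses and E in MPa
fails = simple_extension_criterion(s1=40.0, s2=0.0, s3=0.0,
                                   E=50e3, nu=0.25, e_c=1.5e-4)
```

Note how extension strain arises here under purely compressive loading via the Poisson effect, which is exactly the "paradox" fracture mechanism the abstract describes.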
Limit and shakedown theorems are exact theories of classical plasticity for the direct computation of safety factors or of the load-carrying capacity under constant and varying loads. Simple versions of limit and shakedown analysis are the basis of all design codes for pressure vessels and piping. Using finite element methods, more realistic modelling can be employed for a more rational design, and the methods can be extended to yield optimum plastic design. In this paper we present a first FE implementation of limit and shakedown analyses for perfectly plastic material. Limit and shakedown analyses are performed for a pipe junction, and an interaction diagram is calculated. The results are in good correspondence with the analytic solution given in the appendix.
Safety and reliability of structures may be assessed indirectly by stress distributions. Limit and shakedown theorems are simplified but exact methods of plasticity that provide safety factors directly in the loading space. These theorems may be used for a direct definition of the limit state function for failure by plastic collapse or by inadaptation. In a FEM formulation the limit state function is obtained from a nonlinear optimization problem. This direct approach reduces considerably the necessary knowledge of uncertain technological input data, the computing time, and the numerical error. Moreover, the direct way leads to highly effective and precise reliability analyses. The theorems are implemented into a general purpose FEM program in a way capable of large-scale analysis.
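For the textbook case of a linear limit state g = R − S with independent normal strength and load, the FORM reliability index has a closed form; this generic sketch illustrates the direct approach described above, not the paper's FEM coupling (parameter values are assumed):

```python
# FORM for the linear limit state g = R - S with independent normal
# strength R and load S: the reliability index beta is exact here and
# the failure probability is Pf = Phi(-beta). Assumed example values.

from math import sqrt
from statistics import NormalDist

def form_linear(mu_r, sd_r, mu_s, sd_s):
    beta = (mu_r - mu_s) / sqrt(sd_r ** 2 + sd_s ** 2)
    pf = NormalDist().cdf(-beta)
    return beta, pf

beta, pf = form_linear(mu_r=100.0, sd_r=10.0, mu_s=60.0, sd_s=15.0)
# beta is about 2.22, giving a failure probability of roughly 1.3e-2
```

With a nonlinear limit state function, as obtained from the FEM optimization problem, beta is instead found iteratively as the distance from the origin to the design point in standard normal space.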
Structural design analyses are conducted with the aim of verifying the exclusion of ratcheting. To this end it is important to make a clear distinction between the shakedown range and the ratcheting range. In cyclic plasticity more sophisticated hardening models have been suggested in order to model the strain evolution observed in ratcheting experiments. The hardening models used in shakedown analysis are comparatively simple. It is shown that shakedown analysis can make quite stable predictions of admissible load ranges despite the simplicity of the underlying hardening models. A linear and a nonlinear kinematic hardening model of two-surface plasticity are compared in material shakedown analysis. Both give identical or similar shakedown ranges. Structural shakedown analyses show that the loading may have a more pronounced effect than the hardening model.
Smoothed Finite Element Methods for Nonlinear Solid Mechanics Problems: 2D and 3D Case Studies
(2016)
The Smoothed Finite Element Method (SFEM) is presented as an edge-based technique for 2D and a face-based technique for 3D boundary value problems. SFEMs avoid shortcomings of the standard Finite Element Method (FEM) with lower-order elements, such as overly stiff behaviour, poor stress solutions, and locking effects. Based on the idea of spatially averaging the standard FEM strain field over so-called smoothing domains, the SFEM calculates the stiffness matrix for the same number of degrees of freedom (DOFs) as the FEM. However, the SFEMs significantly improve accuracy and convergence even for distorted meshes and/or nearly incompressible materials.
Numerical results of the SFEMs for a cardiac tissue membrane (thin plate inflation) and an artery (tension of a 3D tube) clearly show their advantages in improving accuracy, particularly for distorted meshes, and in avoiding shear-locking effects.
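The core SFEM operation, averaging the element strain field over a smoothing domain, can be sketched for the edge-based 2D case as an area-weighted mean of the constant strains of the two triangles sharing an edge (illustrative numbers, not data from the studies):

```python
# Edge-based strain smoothing in 2D: the smoothed strain over the domain
# attached to an edge is the area-weighted mean of the constant strains of
# the two linear triangles sharing that edge.

def smoothed_strain(strains, areas):
    # strains: per-element strain vectors [e_xx, e_yy, g_xy]
    # areas:   area portions each element contributes to the smoothing domain
    total = sum(areas)
    return [sum(a * e[i] for e, a in zip(strains, areas)) / total
            for i in range(len(strains[0]))]

# for linear triangles, each element contributes one third of its area
# to the smoothing domain of each of its three edges
eps = smoothed_strain(strains=[[1.0e-3, 0.0, 2.0e-4],
                               [3.0e-3, 1.0e-3, 0.0]],
                      areas=[0.5 / 3.0, 1.0 / 3.0])
```

The smoothed strains replace the raw element strains in the stiffness integration, which is what softens the overly stiff response of plain linear triangles.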
The nonlinear scalar constitutive equations of gases lead to a change in sound speed from point to point, as would be found in linear inhomogeneous (and time-dependent) media. The nonlinear tensor constitutive equations of solids introduce the additional local effect of solution-dependent anisotropy. The speed of a wave passing through a point changes with propagation direction, and its rays are inclined to the front. It is an open question whether the widely used operator-splitting techniques achieve a dimensional splitting with physically reasonable results for these multi-dimensional problems. Maybe this is the main reason why the theoretical and numerical investigation of multi-dimensional wave propagation in nonlinear solids lags so far behind gas dynamics. We hope to advance the subject a little by discussing some fundamental aspects of the solution of the equations of nonlinear elastodynamics. We use methods of characteristics because they only integrate mathematically exact equations which have a direct physical interpretation.
This paper presents the direct route to Design by Analysis (DBA) of the new European pressure vessel standard in the language of limit and shakedown analysis (LISA). This approach leads to an optimization problem, whose solution with finite element analysis is demonstrated for some examples from the DBA manual. One observation from the examples is that the optimisation approach gives reliable and close lower-bound solutions, leading to simple and optimised design decisions.
In: Technical feasibility and reliability of passive safety systems for nuclear power plants. Proceedings of an Advisory Group Meeting held in Jülich, 21-24 November 1994. Vienna, 1996, pp. 43-55 (IAEA-TECDOC-920). Abstract: It is shown that the difficulty for probabilistic fracture mechanics (PFM) is the general problem of the high reliability of a small population. There is as yet no way around this problem. Therefore, what PFM can contribute to the reliability of steel pressure boundaries is demonstrated with the example of a typical reactor pressure vessel and critically discussed. Although no method is distinguishable that could give exact failure probabilities, PFM offers several additional opportunities. Upper limits for the failure probability may be obtained, together with trends for design and operating conditions. Further, PFM can identify the most sensitive parameters, improved control of which would increase reliability. Thus, PFM should play a vital role in the analysis of steel pressure boundaries despite all shortcomings.
Study of swift heavy ion modified conducting polymer composites for application as gas sensor
(2006)
A polyaniline-based conducting composite was prepared by oxidative polymerisation of aniline in a polyvinyl chloride (PVC) matrix. Coherent free-standing thin films of the composite were prepared by a solution-casting method. The PVC-polyaniline composites exposed to 120 MeV silicon ions, with total ion fluences ranging from 10¹¹ to 10¹³ ions/cm², were observed to be more sensitive towards ammonia gas than the unirradiated composite. The response time of the irradiated composites was observed to be comparably shorter. We report for the first time the application of swift heavy ion modified insulating-polymer/conducting-polymer (IPCP) composites for the sensing of ammonia gas.
Micromachined thermal heater platforms offer low electrical power consumption and high modulation speed, i.e. properties which are advantageous for realizing nondispersive infrared (NDIR) gas- and liquid monitoring systems. In this paper, we report on investigations on silicon-on-insulator (SOI) based infrared (IR) emitter devices heated by employing different kinds of metallic and semiconductor heater materials. Our results clearly reveal the superior high-temperature performance of semiconductor over metallic heater materials. Long-term stable emitter operation in the vicinity of 1300 K could be attained using heavily antimony-doped tin dioxide (SnO2:Sb) heater elements.
Magnetic nanoparticles (MNP) are investigated with great interest for biomedical applications in diagnostics (e.g. imaging: magnetic particle imaging (MPI)), therapeutics (e.g. hyperthermia: magnetic fluid hyperthermia (MFH)) and multi-purpose biosensing (e.g. magnetic immunoassays (MIA)). What all of these applications have in common is that they are based on the unique magnetic relaxation mechanisms of MNP in an alternating magnetic field (AMF). While MFH and MPI are currently the most prominent examples of biomedical applications, here we present results on the relatively new biosensing application of frequency mixing magnetic detection (FMMD) from a simulation perspective. In general, we ask how the key parameters of MNP (core size and magnetic anisotropy) affect the FMMD signal: by varying the core size, we investigate the effect of the magnetic volume per MNP; and by changing the effective magnetic anisotropy, we study the MNPs' flexibility to leave their preferred magnetization direction. From this, we predict the most effective combination of MNP core size and magnetic anisotropy for maximum signal generation.
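The frequency-mixing mechanism behind FMMD can be illustrated with an idealized static Langevin magnetization driven by a two-frequency field: the nonlinearity generates components at mixing frequencies such as f1 + 2·f2 that a linear response would not contain (a simplified sketch, not the full MNP relaxation dynamics simulated in the study; all parameter values are assumed):

```python
# Idealized frequency-mixing sketch: a static Langevin magnetization driven
# by a two-frequency excitation field produces a response component at the
# mixing frequency f1 + 2*f2. Assumed frequencies and amplitudes; no MNP
# relaxation dynamics are modelled here.

from math import cos, pi, sinh, cosh

def langevin(x):
    if abs(x) < 1e-6:
        return x / 3.0                    # small-argument limit
    return cosh(x) / sinh(x) - 1.0 / x    # coth(x) - 1/x

def mixing_amplitude(f1, f2, a1, a2, n=20000):
    # Fourier amplitude of the response at f1 + 2*f2, sampled over one
    # second; integer frequencies keep the components orthogonal
    fm = f1 + 2.0 * f2
    acc = 0.0
    for k in range(n):
        t = k / n
        h = a1 * cos(2 * pi * f1 * t) + a2 * cos(2 * pi * f2 * t)
        acc += langevin(h) * cos(2 * pi * fm * t)
    return abs(2.0 * acc / n)

amp = mixing_amplitude(f1=40.0, f2=3.0, a1=1.0, a2=1.0)
# amp is clearly nonzero: the nonlinearity creates the mixing component
# that FMMD detects; a linear response (m = h/3) would give zero here
```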
In collaborative research projects, both researchers and practitioners work together solving business-critical challenges. These projects often deal with ETL processes, in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are not deterministic, their output quality may decrease over time. This leads to an overall quality loss of the application that embeds the machine learning models. Hence, software quality in development and production may differ.
Machine learning models are black boxes. That makes practitioners skeptical and increases the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production offers an early response capability on quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that they integrate possible new inputs into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with an automated software quality monitoring in productive use and a possibility to auto-generate new test data coming from processed documents in production.
Through automated monitoring of the software quality and auto-generated test data, this approach ensures that the software quality meets and keeps requested thresholds in productive use, even during further continuous deployment and changing input data.
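The monitoring idea described above can be sketched minimally: production predictions are scored against later-confirmed labels, and documents whose predictions were wrong are queued as new test and training data (names and the threshold are illustrative, not the paper's reference implementation):

```python
# Minimal quality-monitoring sketch (illustrative names and threshold):
# score production predictions against later-confirmed labels and queue
# mispredicted documents as new test/training data.

def monitor(predictions, confirmed, threshold=0.9):
    correct = sum(p == c for p, c in zip(predictions, confirmed))
    accuracy = correct / len(predictions)
    degraded = accuracy < threshold            # triggers alert / retraining
    retrain_queue = [i for i, (p, c) in enumerate(zip(predictions, confirmed))
                     if p != c]                # documents to feed back
    return accuracy, degraded, retrain_queue

acc, degraded, queue = monitor(
    predictions=["invoice", "order", "invoice", "order"],
    confirmed=["invoice", "order", "order", "order"])
# accuracy 0.75 falls below the 0.9 threshold; document index 2 is queued
```

In a deployment, such a check would run continuously against confirmed documents, so quality loss is detected early and the queued documents flow back into model training.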
Proceedings of the 2nd Humboldt Kolleg, Hammamet, Tunisia. Organizer: Alexander von Humboldt Stiftung, Germany. 184 p.
Welcome Address: Dear Participants, welcome to the 2nd Humboldt Kolleg on "Nanoscale Science and Technology" (NS&T'12) in Tunisia, sponsored by the Alexander von Humboldt foundation. The NS&T'12 multidisciplinary scientific program includes seven "hot" topics dealing with nanoscale science and technology, covering basic and application-oriented research as well as industrial (market) aspects:
- Molecular Biophysics, Spectroscopy Techniques, Imaging Microscopy
- Nanomaterials Synthesis for Medicine and Bio-chemical Sensors
- Nanostructures, Semiconductors, Photonics and Nanodevices
- New Technologies in Market Industry
- Environment, Electro-chemistry, Bio-polymers and Fuel Cells
- Nanomaterials, Photovoltaics, Modelling, Quantum Physics
- Microelectronics, Sensor Networks and Embedded Systems
We are deeply indebted to all members of the Scientific Committee and the General Chairs of the joint sessions, and to all speakers and chairmen, who have dedicated invaluable time and effort to the realization of this event. On behalf of the Organizing Committee, we cordially invite you to join the conference and hope that your stay will be fruitful, rewarding and enjoyable. Prof. Dr. Michael J. Schöning, Prof. Dr. Adnane Abdelghani