Conference Proceeding
The applicability of differential pulse voltammetry (DPV) and adsorptive stripping voltammetry (AdSV) at a non-toxic meniscus-modified silver solid amalgam electrode (m-AgSAE) to the determination of trace amounts of genotoxic substances was demonstrated for micromolar and submicromolar concentrations of 3-nitrofluoranthene, using a methanol - 0.01 mol L-1 NaOH (9:1) mixture as base electrolyte, and of Ostazine Orange, using 0.01 mol L-1 NaOH as base electrolyte.
Interest in PET detectors with monolithic block scintillators is growing. To obtain high spatial resolution, dedicated positioning algorithms are required, but even an ideal algorithm can only deliver information that the detector provides. In this simulation study we investigated the light distribution on one surface of cuboid LSO scintillators of different sizes. Scintillators with a large aspect ratio (small footprint and large height) showed significant position information only above a minimum interaction depth of the gamma particle. The results allow a quantitative estimate of a useful aspect ratio.
In positron emission tomography, improving time, energy, and spatial detector resolutions and using Compton kinematics introduces the possibility of reconstructing a radioactivity distribution image from scatter coincidences, thereby enhancing image quality. The number of single-scattered coincidences alone is of the same order of magnitude as that of true coincidences. In this work, a compact Compton camera module based on monolithic scintillation material is investigated as a detector ring module. The detector interactions are simulated with the Monte Carlo package GATE. The scattering angle inside the tissue is derived from the energy of the scattered photon, which yields a set of possible scattering trajectories, i.e. a broken line of response. The Compton kinematics collimation reduces the number of solutions. Additionally, the time-of-flight information helps localize the position of the annihilation. One of the questions of this investigation is how the energy, spatial, and temporal resolutions help confine the possible annihilation volume. A comparison of currently technically feasible detector resolutions (under laboratory conditions) demonstrates the influence on this annihilation volume and shows that energy and coincidence time resolution have a significant impact. An improvement of the latter from 400 ps to 100 ps shrinks the annihilation volume by around 50%, while an improvement of the energy resolution in the absorber layer from 12% to 4.5% results in a reduction of 60%. The inclusion of single tissue-scattered data has the potential to increase the sensitivity of a scanner by a factor of 2 to 3. The concept can be further optimized and extended to multiple scatter coincidences and subsequently validated by a reconstruction algorithm.
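The step of deriving the scattering angle from the scattered photon's energy follows directly from the Compton formula. The following is a minimal sketch (our own illustration, not code from the study); the function name and the keV units are our choices, and annihilation photons are assumed to start at 511 keV:

```python
import math

E0 = 511.0      # energy of an annihilation photon, keV
ME_C2 = 511.0   # electron rest energy, keV

def compton_angle_deg(e_scattered_kev, e_incident_kev=E0):
    """Scattering angle (degrees) recovered from the scattered photon
    energy via the Compton formula:
    cos(theta) = 1 - me*c^2 * (1/E' - 1/E)."""
    cos_theta = 1.0 - ME_C2 * (1.0 / e_scattered_kev - 1.0 / e_incident_kev)
    return math.degrees(math.acos(cos_theta))
```

For example, a photon scattered down to half its energy (255.5 keV) corresponds to a 90-degree scattering angle; in the camera, the finite energy resolution turns this single angle into a cone of possible trajectories, which is exactly why better energy resolution shrinks the annihilation volume.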
Smoothed Finite Element Methods for Nonlinear Solid Mechanics Problems: 2D and 3D Case Studies
(2016)
The Smoothed Finite Element Method (SFEM) is presented in an edge-based variant for 2D and a face-based variant for 3D boundary value problems. SFEMs avoid shortcomings of the standard Finite Element Method (FEM) with lower-order elements, such as overly stiff behavior, poor stress solutions, and locking effects. Based on the idea of spatially averaging the standard FEM strain field over so-called smoothing domains, SFEM calculates the stiffness matrix for the same number of degrees of freedom (DOFs) as the FEM. Nevertheless, SFEMs significantly improve accuracy and convergence even for distorted meshes and/or nearly incompressible materials.
Numerical results of the SFEMs for a cardiac tissue membrane (thin-plate inflation) and an artery (tension of a 3D tube) clearly show their advantages in improving accuracy, particularly for distorted meshes, and in avoiding shear-locking effects.
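The core strain-smoothing step can be sketched as a simple area-weighted average. This is our own minimal illustration, not the paper's implementation: in an actual edge- or face-based SFEM the smoothing domains are assembled from element sub-cells around each edge or face, whereas here we just show the averaging operation over constant element strains in 2D Voigt notation:

```python
def smoothed_strain(element_strains, element_areas):
    """Area-weighted average of constant element strains over one
    smoothing domain (2D sketch, Voigt components eps_xx, eps_yy, gamma_xy).

    element_strains: list of (eps_xx, eps_yy, gamma_xy) tuples from the
    element portions that make up the smoothing domain.
    element_areas:   the corresponding areas."""
    total = sum(element_areas)
    return tuple(
        sum(a * s[i] for s, a in zip(element_strains, element_areas)) / total
        for i in range(3)
    )
```

The smoothed strain replaces the compatible FEM strain in the stiffness integration, which is what softens the overly stiff response of low-order elements without adding degrees of freedom.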
Recently, the SHARP Corporation, Japan, developed the world's first "Plasma Cluster Ions (PCI)" air purification technology, which uses plasma discharge to generate cluster ions. The plasma cluster device releases positive and negative ions into the air, which are able to decompose and deactivate harmful airborne substances through chemical reactions. Because cluster ions consist of positive and negative ions that normally exist in the natural world, they are regarded as harmless and safe to humans. The amount of ozone generated by cluster ions is less than 0.01 ppm, significantly below the 0.05 ppm standard for industrial operations and consumer electronics, and thus has no harmful effect on the human body. The particular properties and chemical processes involved in PCI treatment are, however, still under study. It has been shown that PCI in most cases has strongly pronounced, irreversible killing effects on airborne microflora due to free-radical-induced reactions and can be considered a potent technology for disinfecting home, medical, and industrial appliances.
The progress in natural language processing (NLP) research over the last years offers novel business opportunities for companies, such as automated user interaction or improved data analysis. Building sophisticated NLP applications requires dealing with modern machine learning (ML) technologies, which often impedes enterprises from establishing successful NLP projects. Our experience in applied NLP research projects shows that the continuous integration of research prototypes into production-like environments with quality assurance builds trust in the software and demonstrates convenience and usefulness with regard to the business goal. We introduce STAMP 4 NLP, an iterative and incremental process model for developing NLP applications. With STAMP 4 NLP, we merge software engineering principles with best practices from data science. Instantiating our process model allows prototypes to be created efficiently by utilizing templates, conventions, and implementations, enabling developers and data scientists to focus on the business goals. Due to our iterative-incremental approach, businesses can deploy an enhanced version of the prototype to their software environment after every iteration, maximizing potential business value and trust early and avoiding the cost of successful yet never-deployed experiments.
When confining pressure is low or absent, extensional fractures are typical, occurring on unloaded planes in rock. These "paradox" fractures can be explained by a phenomenological extension strain failure criterion. A simple empirical criterion for fracture initiation in brittle rock was developed in the past, but it makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion is derived as an extension strain criterion predicting failure modes that are unexpected in the usual understanding of the failure of cohesive-frictional materials. In the progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD) and as a failure surface at peak strength (P). Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass than the simple extension strain criterion.
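The simple (unenriched) extension strain criterion mentioned above can be sketched with linear elasticity: the most extensional principal strain follows from Hooke's law, and fracture is predicted once it exceeds a critical value. This is our own illustration of the classical criterion, not the paper's enriched version (which additionally carries the weighted shear term calibrated to Mohr–Coulomb); the sign convention (compression positive, extension positive) and the numbers are illustrative:

```python
def min_principal_strain(s1, s2, s3, E, nu):
    """Most extensional principal strain from Hooke's law, with
    compression-positive stresses s1 >= s2 >= s3 (MPa); the returned
    value is positive in extension."""
    return (nu * (s1 + s2) - s3) / E

def extension_failure(s1, s2, s3, E, nu, eps_crit):
    """Simple extension strain criterion: fracture initiates when the
    extension strain exceeds the critical extension strain eps_crit."""
    return min_principal_strain(s1, s2, s3, E, nu) > eps_crit
```

The sketch makes the "paradox" explicit: even under purely compressive loading (s3 = 0, no tensile stress anywhere), Poisson expansion produces a positive extension strain on the unloaded plane, so fracture can initiate there.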
Study of swift heavy ion modified conducting polymer composites for application as gas sensors
(2006)
A polyaniline-based conducting composite was prepared by oxidative polymerisation of aniline in a polyvinyl chloride (PVC) matrix. Coherent free-standing thin films of the composite were prepared by solution casting. Polyvinyl chloride-polyaniline composites exposed to 120 MeV silicon ions at total ion fluences ranging from 10¹¹ to 10¹³ ions/cm² were observed to be more sensitive towards ammonia gas than the unirradiated composite. The response time of the irradiated composites was also markedly shorter. We report for the first time the application of swift heavy ion modified insulating polymer/conducting polymer (IPCP) composites for the sensing of ammonia gas.
A capacitive electrolyte-insulator-semiconductor (EISCAP) biosensor modified with Tobacco mosaic virus (TMV) particles for the detection of acetoin is presented. The enzyme acetoin reductase (AR) was immobilized on the surface of the EISCAP using TMV particles as nanoscaffolds. The study focused on the optimization of the TMV-assisted AR immobilization on the Ta₂O₅-gate EISCAP surface. The TMV-assisted acetoin EISCAPs were electrochemically characterized by means of leakage-current, capacitance-voltage, and constant-capacitance measurements. The TMV-modified transducer surface was studied via scanning electron microscopy.
WS GTaD-2003 - The 1st Workshop on Graph Transformations and Design, ed. E. Grabska, pp. 6-7, Jagiellonian University, Krakow. 2 pages.
In this paper, methods for the surface modification of different supports, i.e. glass and polymeric beads, for enzyme immobilisation are described. The developed immobilisation method is based on Schiff's base formation between the amino groups on the enzyme surface and the aldehyde groups on the chemically modified surface of the supports. Silicon surfaces modified by APTS and GOPS with immobilised enzyme were characterised by atomic force microscopy (AFM), time-of-flight secondary ion mass spectrometry (ToF-SIMS), and infrared spectroscopy (FTIR). The supports with immobilised enzyme (urease) were also tested in combination with microreactors fabricated in silicon and Perspex, operating in a flow-through system. For microreactors filled with urease immobilised on glass beads (Sigma) and on polymeric beads (PAN), a very high and stable signal (pH change) was obtained. The developed method of urease immobilisation can thus be considered very effective.
Structural design analyses are conducted with the aim of verifying the exclusion of ratcheting. To this end, it is important to make a clear distinction between the shakedown range and the ratcheting range. In cyclic plasticity, increasingly sophisticated hardening models have been suggested to capture the strain evolution observed in ratcheting experiments. The hardening models used in shakedown analysis are comparatively simple. It is shown that shakedown analysis can nevertheless make quite stable predictions of admissible load ranges despite the simplicity of the underlying hardening models. A linear and a nonlinear kinematic hardening model of two-surface plasticity are compared in material shakedown analysis. Both give identical or similar shakedown ranges. Structural shakedown analyses show that the loading may have a more pronounced effect than the choice of hardening model.
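Why a simple hardening model can suffice is easiest to see in one dimension. The following is our own minimal sketch (not the paper's two-surface model): a stress-driven 1D material point with linear kinematic hardening, where the backstress X translates the yield surface. Under stress-controlled cycling such a model either shakes down or settles into alternating plasticity; it cannot ratchet, which illustrates why shakedown bounds from simple kinematic hardening are stable:

```python
def cycle_plastic_strain(stress_history, sigma_y, C):
    """Stress-driven 1D elastic-plastic response with linear kinematic
    hardening (backstress X, hardening modulus C). Returns the plastic
    strain after each prescribed stress value."""
    X, eps_p, out = 0.0, 0.0, []
    for sigma in stress_history:
        over = abs(sigma - X) - sigma_y
        if over > 0.0:                        # outside the yield surface
            d = over / C * (1 if sigma > X else -1)
            eps_p += d                        # plastic flow
            X += C * d                        # backstress follows the flow
        out.append(eps_p)
    return out
```

For a cycle between +300 and -100 with yield stress 200, the first overload produces plastic strain, the backstress shifts, and every later excursion stays on or inside the translated yield surface: elastic shakedown after the first load step.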
The sorption of toxic-shock LPS by nanoparticles based on carbonized vegetable raw materials
(2008)
Immobilization of lactobacilli on high-temperature carbonized vegetable raw materials (rice husk, grape stones) increases their physiological activity and the quantity of antibacterial metabolites, which consequently increases the antagonistic activity of the lactobacilli. This implies that the use of such nanosorbents for the attachment of probiotic microorganisms is highly promising for solving important problems such as delivering probiotic preparations to the right location and attaching them to the intestinal mucosa, with subsequent detoxication of the gastro-intestinal tract and normalization of its microecology. In addition, the carbonized nanoparticles obtained have a peculiar property: the ability to sorb toxic-shock LPS and hence to detoxify LPS.
In the energy economy, forecasts of different time series are still rudimentary. In this study, a prediction for the German day-ahead spot market is created with Apache Spark and R. This is just one example among many applications in virtual power plant environments. Other use cases, such as intraday price processes, load processes of machines or electric vehicles, real-time energy loads of photovoltaic systems, and many more time series, need to be analysed and predicted.
This work gives a short introduction to the project in which this study is embedded. It briefly describes the time-series methods used in the energy industry for forecasting. Apache Spark, a powerful cluster computing technology, is utilised as the programming framework. Today, single time series can be predicted. The focus of this work is on developing a method for parallel forecasting, i.e. processing multiple time series simultaneously with R and Apache Spark.
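The fan-out pattern behind parallel forecasting — apply one forecasting routine independently to many series at once — can be sketched without Spark. The following is our own Python stdlib illustration of that pattern (the study itself uses Apache Spark and R, and the naive moving-average forecaster stands in for whatever model each series gets):

```python
from concurrent.futures import ThreadPoolExecutor

def moving_average_forecast(series, window=3):
    """Naive one-step-ahead forecast: mean of the last `window` values."""
    tail = series[-window:]
    return sum(tail) / len(tail)

def forecast_many(named_series, window=3, workers=4):
    """Forecast many time series in parallel: each series is handled
    independently, so the work distributes trivially across workers --
    the same structure Spark exploits at cluster scale."""
    names = list(named_series)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(
            lambda n: (n, moving_average_forecast(named_series[n], window)),
            names,
        )
    return dict(results)
```

Because the per-series computations share no state, swapping the thread pool for Spark's distributed map changes only where the work runs, not the structure of the method.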
Abstract of the authors: In many areas of computer science, ontologies are becoming more and more important. The use of ontologies for domain modeling often raises the issue of ontology integration: the task of merging several ontologies, each covering a specific subdomain, into one unified ontology. Many approaches to ontology integration aim at automating the process of ontology alignment. However, complete automation is not feasible, and user interaction is always required. Nevertheless, most ontology integration tools offer only very limited support for the interactive part of the integration process. In this paper, we present a novel approach to the interactive integration of ontologies. The result of the integration is incrementally updated after each definition of a correspondence between ontology elements, and the user is guided through the ontologies to be integrated. By restricting the possible user actions, the tool we developed ensures the integrity of all defined correspondences. We evaluated our tool by integrating different regulations concerning building design.
Fields of asymmetric tensors play an important role in many applications, such as medical imaging (diffusion tensor magnetic resonance imaging), physics, and civil engineering (for example the Cauchy-Green deformation tensor or the strain tensor with local rotations). However, such asymmetric tensors are usually symmetrized before further processing, which results in a loss of information. A new method for processing asymmetric tensor fields is proposed; attention is restricted to second-order tensors given by a 2x2 array or matrix with real entries. The method applies a transformation yielding Hermitian matrices, which have an eigendecomposition similar to that of symmetric matrices. With this new idea, numerical results are given for real-world data arising from the deformation of an object by external forces. It is shown that the asymmetric part indeed contains valuable information.
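One standard construction consistent with this description (we cannot confirm it is the authors' exact transformation) splits the real matrix M into its symmetric part S and antisymmetric part A and forms H = S + iA. Since S is symmetric and A is antisymmetric, H is Hermitian, so it has real eigenvalues and an orthogonal eigendecomposition, and no information from the antisymmetric part is discarded. A minimal sketch:

```python
def hermitian_embedding(M):
    """Map a real 2x2 (possibly asymmetric) matrix M to H = S + i*A,
    where S = (M + M^T)/2 is the symmetric and A = (M - M^T)/2 the
    antisymmetric part. H is Hermitian: H[i][j] == conj(H[j][i])."""
    S = [[(M[i][j] + M[j][i]) / 2 for j in range(2)] for i in range(2)]
    A = [[(M[i][j] - M[j][i]) / 2 for j in range(2)] for i in range(2)]
    return [[complex(S[i][j], A[i][j]) for j in range(2)] for i in range(2)]
```

Unlike symmetrization, which simply drops A, this embedding keeps the rotational content of the tensor in the imaginary parts while still permitting a symmetric-matrix-style spectral analysis.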
Hypertension describes the pathological increase of blood pressure, which is most commonly associated with an increase in vascular wall stiffness [1]. According to the "Deutsche Bluthochdruck Liga", this pathology shows a growing trend in our aging society. In order to find novel pharmacological and possibly personalized treatments, we present a functional approach to studying the biomechanical properties of a human aortic vascular model.
In this method review we give an overview of recent studies carried out with the CellDrum technology [2] and underline its added value compared to existing standard procedures known from the field of physiology.
The CellDrum technology described herein is a system for measuring the functional mechanical properties of cell monolayers and thin tissue constructs in vitro. Additionally, the CellDrum makes it possible to elucidate the mechanical response of cells to pharmacological drugs, toxins, and vasoactive agents. Due to its highly flexible polymer support, cells can also be mechanically stimulated by steady and cyclic biaxial stretching.
IASSE-2004 - 13th International Conference on Intelligent and Adaptive Systems and Software Engineering, eds. W. Dosch, N. Debnath, pp. 245-250, ISCA, Cary, NC, 1-3 July 2004, Nice, France.
We introduce a UML-based model for conceptual design support in civil engineering and identify the required extensions to standard UML. Class diagrams are used for elaborating building-type-specific knowledge; object diagrams, implicitly contained in the architect's sketch, are validated against the defined knowledge. To enable the use of industrial, domain-specific tools, we provide an integrated conceptual design extension. The developed tool support is based on graph rewriting. With our approach, architects can work with semantic objects during the early design phase, assisted by incremental consistency checks.
Clearance of blood components and fluid drainage play a crucial role in subarachnoid hemorrhage (SAH) and post-hemorrhagic hydrocephalus (PHH). With the involvement of interstitial fluid (ISF) and cerebrospinal fluid (CSF), two pathways for the clearance of fluid and solutes in the brain have been proposed. Starting at the level of capillaries, ISF flows along the basement membranes in the walls of cerebral arteries out of the parenchyma to drain into the lymphatics and CSF [1]–[3]. Conversely, it has been shown that CSF enters the parenchyma between the glial and pial basement membranes of penetrating arteries [4]–[6]. Nevertheless, the structures involved and the contribution of either flow pathway to the fluid balance between the subarachnoid space and the interstitial space remain controversial. Low-frequency oscillations in vascular tone are referred to as vasomotion, and the corresponding vasomotion waves are modeled as the driving force for the flow of ISF out of the parenchyma [7]. Retinal vessel analysis (RVA) allows non-invasive measurement of retinal vessel vasomotion in terms of diameter changes [8]. Thus, the aim of this study is to investigate vasomotion in RVA signals of SAH and PHH patients.
Proc. of the 2005 ASCE Intl. Conf. on Computing in Civil Engineering (ICCC 2005), eds. L. Soibelman and F. Pena-Mora, pp. 1-14, ASCE (CD-ROM), Cancun, Mexico, 2005.
Current CAD tools are not able to support the fundamental conceptual design phase, and none of them provides consistency analyses of the sketches produced by architects. To give architects greater support in the conceptual design phase, we are developing a CAD tool for conceptual design and a knowledge specification tool allowing the definition of conceptually relevant knowledge. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic knowledge model, different types of design rules formalize the knowledge in a graph-based realization. An expressive visual language provides a user-friendly, human-readable representation. Finally, consistency analyses enable conceptual designs to be checked against the defined knowledge. In this paper we concentrate on the knowledge specification part of our project.
One of the interesting but not well-known properties of water is the appearance of highly ordered structures in response to a strong electric field. In 1893, Sir William Armstrong placed a cotton thread between two wine glasses filled with chemically pure water. When a high DC voltage was applied between the glasses, a connection consisting of water formed, producing a "water bridge".
The workflow of a high-throughput screening setup for the rapid identification of new and improved sensor materials is presented. The polyol method was applied to prepare nanoparticulate metal oxides as base materials, which were functionalised by surface doping. Using multi-electrode substrates and high-throughput impedance spectroscopy (HT-IS), a wide range of materials could be screened in a short time. Applying HT-IS in the search for new selective gas-sensing materials, a NO2-tolerant NO-sensing material with reduced sensitivities towards other test gases was identified, based on iridium-doped zinc oxide. Analogous behaviour was observed for iridium-doped indium oxide.
Within the scope of fatigue analyses it must be verified that thermally induced progressive deformations remain bounded. To this end, the delimitation of the shakedown range from the ratcheting range (progressive deformation) is of interest. Within an EU-funded research project, experiments with a four-bar model were carried out. The experiment consisted of a water-cooled inner tube and three insulated, heatable outer specimen bars. The system was loaded by alternating axial forces on which alternating temperatures at the outer bars were superimposed. The test parameters were chosen partly on the basis of preceding shakedown analyses. During the tests, temperatures and strains were measured as functions of time. Accompanying and following the tests, the loads and the resulting stresses were reproduced numerically. In this incremental elasto-plastic analysis with the program ANSYS, different material models were applied. The results of these simulations serve to verify the shakedown analyses performed with the FE method.