Keywords: Biosensor; CAD; Finite Element Method; civil engineering; shakedown analysis; Clusterion; limit analysis; natural language processing
Limit and shakedown theorems are exact theories of classical plasticity for the direct computation of safety factors or of the load-carrying capacity under constant and varying loads. Simple versions of limit and shakedown analysis form the basis of all design codes for pressure vessels and piping systems. Finite Element Methods allow more realistic modelling and thus a more rational design, and the methods can be extended to yield optimum plastic design. In this paper we present a first FE implementation of limit and shakedown analyses for perfectly plastic material. Limit and shakedown analyses are performed for a pipe junction, and an interaction diagram is calculated. The results agree well with the analytic solution given in the appendix.
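To illustrate how the static (lower-bound) limit theorem turns limit analysis into an optimization problem, here is a minimal sketch for a toy three-bar parallel truss, not the paper's FE implementation: maximize the load factor subject to equilibrium and the yield condition, solved as a linear program. All values (unit areas, unit yield stress) are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Lower-bound limit analysis of three parallel bars (unit area, unit
# yield stress) carrying a load lam * P.  Variables:
# x = [sigma1, sigma2, sigma3, lam].  Maximize lam subject to
# equilibrium sigma1 + sigma2 + sigma3 = lam * P and |sigma_i| <= sigma_y.
sigma_y, P = 1.0, 1.0
c = np.array([0.0, 0.0, 0.0, -1.0])                 # minimize -lambda
A_eq = np.array([[1.0, 1.0, 1.0, -P]])              # equilibrium constraint
b_eq = np.array([0.0])
bounds = [(-sigma_y, sigma_y)] * 3 + [(0.0, None)]  # yield condition, lam >= 0
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
lam_limit = res.x[3]
print(f"limit load factor = {lam_limit:.3f}")
```

All three bars reach the yield stress at the limit state, so the exact limit load factor here is 3. Realistic FE discretizations produce the same structure of problem with many thousands of variables and constraints.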
Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics and play a significant role in mechanical and civil engineering design. The direct methods aim to determine the ultimate load-bearing capacity of structures beyond the elastic range. For practical problems they lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, shakedown analysis becomes a problem of stochastic programming. This paper presents chance-constrained programming, an effective method of stochastic programming, for solving the shakedown analysis problem under random conditions of strength. In our investigation the loading is deterministic, while the strength is modelled as a normally or lognormally distributed variable.
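The key step in chance-constrained programming is replacing a probabilistic constraint by its deterministic equivalent. A minimal sketch for a single constraint, with illustrative numbers that are not taken from the paper: for a deterministic load effect S and a random strength R, the chance constraint P(R >= lam * S) >= p reduces to a bound on the load factor lam via the p-quantile of R.

```python
import numpy as np
from scipy.stats import norm

# Deterministic equivalent of the chance constraint
#   P(R >= lam * S) >= p
# for a deterministic load effect S and random strength R.
p = 0.95
z = norm.ppf(p)            # standard normal quantile for level p
S = 100.0                  # deterministic load effect (illustrative)

# Normally distributed strength R ~ N(mu_R, sd_R):
mu_R, sd_R = 300.0, 30.0
lam_normal = (mu_R - z * sd_R) / S

# Lognormally distributed strength, ln R ~ N(m, s):
m, s = np.log(300.0), 0.1
lam_lognormal = np.exp(m - z * s) / S

print(lam_normal, lam_lognormal)
```

In the full shakedown problem, bounds of this kind replace the random yield constraints, so the stochastic program becomes an ordinary convex optimization problem.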
7th International Conference on Reliability of Materials and Structures (RELMAS 2008), June 17-20, 2008, Saint Petersburg, Russia, pp. 354-358. Reprint with corrections in red. Introduction: The analysis of advanced structures working under extremely heavy loading, such as nuclear power plants and piping systems, should take into account the randomness of loading and of geometrical and material parameters. The existing reliability methods are restricted mostly to the elastic working regime, e.g. to allowable local stresses. The development of reliability-based limit and shakedown analysis and design methods, which exploit the potential of the shakedown working regime, is therefore highly needed. In this paper the application of a new algorithm of probabilistic limit and shakedown analysis for shell structures is presented, in which the loading and the strength of the material as well as the thickness of the shell are considered as random variables. The reliability analysis problems can be solved efficiently by a system combining available FE codes, a deterministic limit and shakedown analysis, and the First and Second Order Reliability Methods (FORM/SORM). Nonlinear sensitivities are obtained directly from the solution of the deterministic problem without extra computational cost.
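For the special case of a linear limit state g = R - S with independent normal variables, FORM is exact and reduces to a closed-form reliability index. A minimal sketch with illustrative values (the paper treats general shell structures, where the index is found iteratively):

```python
import math
from scipy.stats import norm

# FORM for the linear limit state g = R - S with independent
# normal strength R and load S (illustrative values).
mu_R, sd_R = 300.0, 30.0
mu_S, sd_S = 200.0, 20.0
beta = (mu_R - mu_S) / math.hypot(sd_R, sd_S)  # reliability index
pf = norm.cdf(-beta)                           # failure probability estimate
print(beta, pf)
```

For nonlinear limit states, FORM linearizes g at the most probable failure point, and SORM adds a curvature correction to the same index.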
Limit and shakedown analyses are simplified yet exact methods of plasticity which, apart from requiring sufficient ductility, involve no restrictive assumptions. The simplifications concern the acquisition of data and of models for the details of the load history and of the material behaviour. Unlike the classical treatment of nonlinear problems in structural mechanics, the method leads to optimization problems. For realistic FEM models these become very large, which has strongly delayed the industrial application of limit and shakedown analyses. This situation is fundamentally changed by the Brite-EuRam project LISA. In LISA, a method for the direct computation of the load-carrying capacity of ductile structures is being developed on the basis of the industrial FEM code PERMAS. This allows the operating range of components and structures to be extended into the plastic regime without substantially increasing the effort compared with elastic analyses. The considerable savings in computing time permit parameter studies and the computation of interaction diagrams, which provide a quick overview of the possible operating ranges. It turns out that, depending on the component and its loads, decisive safety gains can in some cases be achieved for extending the operating ranges. The approach often requires some rethinking on the part of the user: no stresses are computed from which safety and lifetime would then be interpreted; instead, the sought safety factor is computed directly. The post-processor is needed only to check the model and the computation. The procedure is similar to stability analysis (buckling). Renowned industrial project partners guarantee validation and applicability to a broad range of technical problems. The nonlinear reliability analyses also developed in LISA become effective only on the basis of direct methods.
Without limit and shakedown analysis, plastic structural optimization is hardly feasible even today. The planned extensions of the material modelling to nonlinear hardening and to damage could not be discussed here. There is a distinct lack of experiments verifying the boundaries between elastic shakedown and failure by LCF or by ratchetting.
Magnetic nanoparticles (MNP) are investigated with great interest for biomedical applications in diagnostics (e.g. imaging: magnetic particle imaging (MPI)), therapeutics (e.g. hyperthermia: magnetic fluid hyperthermia (MFH)) and multi-purpose biosensing (e.g. magnetic immunoassays (MIA)). What all of these applications have in common is that they are based on the unique magnetic relaxation mechanisms of MNP in an alternating magnetic field (AMF). While MFH and MPI are currently the most prominent examples of biomedical applications, here we present results on the relatively new biosensing application of frequency mixing magnetic detection (FMMD) from a simulation perspective. In general, we ask how the key parameters of MNP (core size and magnetic anisotropy) affect the FMMD signal: by varying the core size, we investigate the effect of the magnetic volume per MNP; and by changing the effective magnetic anisotropy, we study the MNPs' flexibility to leave their preferred magnetization direction. From this, we predict the most effective combination of MNP core size and magnetic anisotropy for maximum signal generation.
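The principle behind FMMD can be sketched numerically: a nonlinear magnetization driven by a two-frequency field generates intermodulation products such as f1 + 2*f2. The sketch below uses a zero-anisotropy Langevin model with illustrative frequencies and amplitudes; it is not the simulation model of the paper, which additionally varies core size and anisotropy.

```python
import numpy as np

# Two-tone excitation field: high frequency f1, low frequency f2.
fs = 100_000             # sampling rate (Hz)
t = np.arange(fs) / fs   # 1 s window -> 1 Hz FFT resolution
f1, f2 = 1000, 60        # excitation frequencies (Hz), illustrative
H = 0.5 * np.sin(2*np.pi*f1*t) + 2.0 * np.sin(2*np.pi*f2*t)

# Langevin magnetization (superparamagnetic, zero anisotropy).
xi = 3.0 * H             # dimensionless field argument, illustrative scale
with np.errstate(divide='ignore', invalid='ignore'):
    M = np.where(xi == 0.0, 0.0, 1.0/np.tanh(xi) - 1.0/xi)

# The odd nonlinearity produces mixing components at f1 +/- 2*f2.
spec = np.abs(np.fft.rfft(M)) / len(t)
mix = spec[f1 + 2*f2]    # FFT bin index equals frequency in Hz here
print(f"amplitude at f1+2*f2 = {mix:.4g}")
```

Because the Langevin function is odd, only odd-order mixing products appear, which is why FMMD detection is typically done at f1 +/- 2*f2.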
Hydrophobic magnetic nanoparticles (NPs) consisting of undecanoate-capped magnetite (Fe3O4, average diameter ca. 5 nm) are used to control quantized electron transfer to surface-confined redox units and metal NPs. A two-phase system consisting of an aqueous electrolyte solution and a toluene phase that includes the suspended undecanoate-capped magnetic NPs is used to control the interfacial properties of the electrode surface. The attracted magnetic NPs form a hydrophobic layer on the electrode surface, changing the mechanisms of the surface-confined electrochemical processes. A quinone-monolayer-modified Au electrode demonstrates an aqueous-type electrochemical process (2e-/2H+ redox mechanism) for the quinone units in the absence of the hydrophobic magnetic NPs, while the attraction of the magnetic NPs to the surface results in the stepwise single-electron transfer mechanism characteristic of a dry nonaqueous medium. Also, the attraction of the hydrophobic magnetic NPs to the Au electrode surface modified with Au NPs (ca. 1.4 nm) yields a microenvironment with a low dielectric constant that results in the single-electron quantum charging of the Au NPs.
Market abstraction of energy markets and policies - application in an agent-based modeling toolbox
(2023)
In light of emerging challenges in energy systems, markets are subject to changing dynamics and market designs. Simulation models are commonly used to understand the changing dynamics of future electricity markets. However, existing market models were often created with specific use cases in mind, which limits their flexibility and usability and makes it difficult to compare different market designs with a single model. This paper introduces a new method of defining market designs for energy market simulations. The proposed concept makes it easy to incorporate different market designs into electricity market models, using relevant parameters derived from an analysis of existing simulation tools, morphological categorization and ontologies. These parameters are then used to derive a market abstraction and to integrate it into an agent-based simulation framework, allowing a unified analysis of diverse market designs. Furthermore, we showcase the usability of the approach by integrating new types of long-term contracts and over-the-counter trading. To validate the approach, two case studies are demonstrated: a pay-as-clear market and a pay-as-bid long-term market. These examples demonstrate the capabilities of the proposed framework.
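The first of the two case-study mechanisms can be sketched in a few lines. This is a generic pay-as-clear (uniform price) clearing rule with made-up bids, not code from the paper's toolbox: supply bids are sorted into merit order, accepted until demand is met, and the marginal bid sets the single clearing price paid to all accepted suppliers.

```python
# Minimal pay-as-clear (uniform price) market clearing sketch.
def clear_pay_as_clear(supply_bids, demand):
    """supply_bids: list of (price, volume); returns (price, accepted)."""
    accepted, remaining = [], demand
    price = 0.0
    for p, v in sorted(supply_bids):   # merit order: cheapest first
        if remaining <= 0:
            break
        take = min(v, remaining)
        accepted.append((p, take))
        remaining -= take
        price = p                      # marginal accepted bid sets the price
    return price, accepted

# Illustrative bids: (price in EUR/MWh, volume in MWh).
bids = [(20.0, 50), (35.0, 40), (50.0, 60), (10.0, 30)]
price, accepted = clear_pay_as_clear(bids, demand=100)
print(price, accepted)
```

Under pay-as-bid, by contrast, each accepted supplier would simply receive its own bid price; a unified market abstraction lets both rules plug into the same agent-based simulation.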
This paper presents a two-dimensional-in-space mathematical model of biosensors based on an array of enzyme microreactors immobilised on a single electrode. The modelled system operates under amperometric conditions. The microreactors were modelled as particles and as strips. The model is based on diffusion equations containing a nonlinear term related to the Michaelis-Menten kinetics of the enzymatic reaction. The model involves three regions: an array of enzyme microreactors, where the enzyme reaction as well as mass transport by diffusion take place; a diffusion-limiting region, where only diffusion takes place; and a convective region, where the analyte concentration is maintained constant. Using computer simulation, the influence of the geometry of the microreactors and of the diffusion region on the biosensor response was investigated. The digital simulation was carried out using the finite difference technique.
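A one-dimensional version of this kind of reaction-diffusion model conveys the idea of the finite difference scheme. The sketch below is a generic explicit (FTCS) discretisation with illustrative parameter values, not the paper's two-dimensional model: substrate diffuses into an enzyme layer, is consumed at a Michaelis-Menten rate, is held at the bulk concentration at the outer boundary, and sees a zero-flux condition at the electrode.

```python
import numpy as np

# Explicit finite-difference sketch of the 1D enzyme-layer equation
#   dS/dt = D * d2S/dx2 - Vmax * S / (Km + S)
# All parameter values are illustrative, not from the paper.
D, Vmax, Km, S0 = 1e-9, 1e-4, 1e-1, 1.0   # diffusivity, kinetics, bulk conc.
L, N = 1e-4, 51                            # layer thickness (m), grid points
dx = L / (N - 1)
dt = 0.4 * dx**2 / D                       # within the explicit stability limit
S = np.zeros(N)
S[-1] = S0                                 # bulk (convective) boundary

for _ in range(20_000):
    lap = (S[2:] - 2*S[1:-1] + S[:-2]) / dx**2           # discrete Laplacian
    S[1:-1] += dt * (D * lap - Vmax * S[1:-1] / (Km + S[1:-1]))
    S[0] = S[1]                            # zero flux at the electrode
    S[-1] = S0

print(S[0])  # substrate concentration at the electrode after ~32 s
```

The two-dimensional model of the paper discretises the Laplacian over both spatial directions and distinguishes the enzyme, diffusion-limiting and convective regions, but the time stepping works the same way.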
Research collaborations provide opportunities for both practitioners and researchers: practitioners need solutions for difficult business challenges, and researchers look for hard problems to solve and publish. Nevertheless, research collaborations carry the risk that practitioners focus too much on quick solutions and that researchers tackle purely theoretical problems, resulting in products which do not fulfill the project requirements.
In this paper we introduce an approach extending the ideas of agile and lean software development. It helps practitioners and researchers keep track of their common research collaboration goal: a scientifically enriched software product which fulfills the needs of the practitioner’s business model.
This approach gives first-class status to application-oriented metrics that measure progress and success of a research collaboration continuously. Those metrics are derived from the collaboration requirements and help to focus on a commonly defined goal.
An appropriate tool set evaluates and visualizes those metrics with minimal effort, encouraging all participants to focus their effort on their tasks. Thus project status, challenges and progress are transparent to all research collaboration members at any time.
A new and simple method for nanostructuring using conventional photolithography and a layer-expansion or pattern-size reduction technique is presented, which can further be applied to the fabrication of different nanostructures and nanodevices. The method is based on the conversion of a photolithographically patterned metal layer to a metal-oxide mask with improved pattern-size resolution using thermal oxidation. With this technique, the pattern size can be scaled down to dimensions of several nanometers. The proposed method is experimentally demonstrated by preparing nanostructures with different configurations and layouts, such as circles, rectangles, trapezoids, and fluidic-channel-, cantilever- and meander-type structures.
Various planar technologies are employed for developing solid-state sensors with low cost, small size and high reproducibility; thin- and thick-film technologies are the most suitable for such production. Screen-printing is especially attractive due to its simplicity, low cost, high reproducibility and efficiency in large-scale production. This technology enables the deposition of a thick layer and allows precise pattern control. Moreover, it is a highly economic technology, since little of the ink is wasted: over repeated film-deposition steps there is no loss of material, owing to the additive nature of thick-film technology. Finally, the thick films can be deposited easily and quickly on inexpensive substrates. In this contribution, thick-film ion-selective electrodes based on ionophores as well as on crystalline ion-selective materials, designed for potentiometric measurements, are demonstrated. The analytical parameters of these sensors are comparable with those reported for conventional potentiometric electrodes. All of the thick-film strip electrodes mentioned were fabricated entirely in a single, fully automated thick-film process, without any additional manual, chemical or electrochemical steps. In all cases simple, inexpensive, commercially available materials were used, i.e. flexible plastic substrates and easily cured polymer-based pastes.