Conference Proceeding
A 3D finite element model of the female pelvic floor for the reconstruction of urinary incontinence
(2014)
A concept for a sensitive micro total analysis system for high throughput fluorescence imaging
(2006)
This paper discusses possible methods for on-chip fluorescence imaging for integrated biosensors. The integration of optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging: it can boost the signal-to-background ratio by a few orders of magnitude in comparison to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis systems (µTAS). The first method relies on side illumination of the fluorescent material placed into microcompartments of the lab-on-chip; its significance lies in the high utilization of excitation energy for low concentrations of fluorescent material. The second method utilizes a transparent µLED chip, which allows the excitation light sources to be placed on the same optical axis as the emission detector, so that the excitation and emission rays propagate in opposite directions. The third method presents a spatial filtering of the excitation background.
A melting probe equipped with an autofluorescence-based detection system combined with a light scattering unit and, optionally, with a microarray chip would be ideally suited to probe icy environments like Europa’s ice layer as well as the polar ice layers of Earth and Mars for recent and extinct life.
The importance of validating and reproducing the outcome of computational processes is fundamental to many application domains. Assuring the provenance of workflows will likely become even more important with the incorporation of human tasks into standard workflows by emerging standards such as WS-HumanTask. This paper addresses this trend with an actor-based workflow approach that actively supports provenance. It proposes a framework to track and store provenance information automatically that applies to various workflow management systems. In particular, the introduced provenance framework supports the documentation of workflows in a legally binding way. The authors therefore use the concept of layered XML documents, i.e. history-tracing XML. Furthermore, the proposed provenance framework enables the executors (actors) of a particular workflow task to attest their operations and the associated results by integrating digital XML signatures.
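The layering idea behind history-tracing XML can be sketched as follows: each completed task wraps the previous document version in a new layer that records the actor and the task's result. The element and attribute names here are illustrative, not the actual history-tracing XML schema, and the digital-signature step is omitted.

```python
import xml.etree.ElementTree as ET

def add_history_layer(doc_xml: str, actor: str, result: str) -> str:
    """Wrap the previous workflow document in a new layer that records the
    acting executor and the outcome of the completed task (illustrative
    sketch; not the history-tracing XML schema from the paper)."""
    layer = ET.Element("layer", actor=actor)
    layer.append(ET.fromstring(doc_xml))          # previous state becomes a child
    ET.SubElement(layer, "result").text = result  # attach the task's outcome
    return ET.tostring(layer, encoding="unicode")

doc = "<workflow><task>review</task></workflow>"
v2 = add_history_layer(doc, actor="alice", result="approved")
```

Because every layer contains the complete previous version as a child, the full history of the document remains traceable from the outermost element inwards.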
A New Class of Biosensors Based on Tobacco Mosaic Virus and Coat Proteins as Enzyme Nanocarrier
(2016)
An increasing number of applications target their executions at specific hardware such as general-purpose graphics processing units. Some Cloud Computing providers offer this specific hardware so that organizations can rent such resources. However, outsourcing the whole application to the Cloud incurs avoidable costs if only some parts of the application benefit from the specific, expensive hardware. A partial execution of applications in the Cloud is a tradeoff between costs and efficiency. This paper addresses the demand for a consistent framework that allows for a mixture of on- and off-premise computations by migrating only specific parts to a Cloud. It uses the concept of workflows to show how individual workflow tasks can be migrated to the Cloud while the remaining tasks are executed on-premise.
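The cost/efficiency tradeoff for partial migration can be illustrated with a toy placement rule; the cost model and parameter names are assumptions for illustration, not the paper's framework. A task is moved to the Cloud only if the rented hardware shortens it enough to offset the rental price.

```python
def place_tasks(tasks, cloud_cost_per_h, speedup):
    """Decide per workflow task whether migrating it to GPU-equipped Cloud
    nodes pays off (illustrative cost model). Each task is given as
    (name, duration in hours, benefits-from-GPU flag, on-premise cost/h)."""
    plan = {}
    for name, hours, gpu_bound, onprem_cost_per_h in tasks:
        # GPU-bound tasks finish faster in the Cloud; others take as long as before
        cloud_cost = (hours / speedup if gpu_bound else hours) * cloud_cost_per_h
        onprem_cost = hours * onprem_cost_per_h
        plan[name] = "cloud" if cloud_cost < onprem_cost else "on-premise"
    return plan

plan = place_tasks(
    tasks=[("render", 10, True, 2.0), ("cleanup", 1, False, 2.0)],
    cloud_cost_per_h=5.0,
    speedup=8,
)
```

In this example the GPU-bound task is migrated while the short cleanup task stays on-premise, mirroring the mixed on-/off-premise execution the paper describes.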
The ANM’09 multi-disciplinary scientific program includes topics in the fields of "Nanotechnology and Microelectronics" ranging from "Bio/Micro/Nano Materials and Interfacing" aspects, "Chemical and Bio-Sensors", "Magnetic and Superconducting Devices", "MEMS and Microfluidics" over "Theoretical Aspects, Methods and Modelling" up to the important bridging "Academics meet Industry".
Supervised machine learning and deep learning require a large amount of labeled data, which data scientists obtain in a manual, time-consuming annotation process. To mitigate this challenge, Active Learning (AL) proposes promising data points for annotators to label next, instead of a sequential or random sample. This method is supposed to save annotation effort while maintaining model performance.
However, practitioners face many AL strategies for different tasks and need an empirical basis to choose between them. Surveys categorize AL strategies into taxonomies without performance indications. Presentations of novel AL strategies compare the performance to a small subset of strategies. Our contribution addresses the empirical basis by introducing a reproducible active learning evaluation (ALE) framework for the comparative evaluation of AL strategies in NLP.
The framework allows the implementation of AL strategies with low effort and a fair data-driven comparison through defining and tracking experiment parameters (e.g., initial dataset size, number of data points per query step, and the budget). ALE helps practitioners to make more informed decisions, and researchers can focus on developing new, effective AL strategies and deriving best practices for specific use cases. With best practices, practitioners can lower their annotation costs. We present a case study to illustrate how to use the framework.
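One such experiment loop can be sketched using the tracked parameters named above (initial dataset size, number of data points per query step, and the budget); the function names and the toy strategy below are illustrative, not ALE's API.

```python
import random

def run_al_experiment(pool, label_fn, train_fn, strategy,
                      initial_size=10, step=5, budget=30, seed=0):
    """Simulate one AL experiment: start from a random seed set, then let
    the AL strategy pick `step` points per query round until the
    annotation budget is exhausted (illustrative sketch)."""
    rng = random.Random(seed)
    unlabeled = list(pool)
    rng.shuffle(unlabeled)
    labeled = [(x, label_fn(x)) for x in unlabeled[:initial_size]]
    unlabeled = unlabeled[initial_size:]
    spent = initial_size
    history = []
    while spent < budget and unlabeled:
        model = train_fn(labeled)                  # retrain on current labels
        picked = strategy(model, unlabeled)[:step] # strategy ranks the pool
        for x in picked:
            labeled.append((x, label_fn(x)))
            unlabeled.remove(x)
        spent += len(picked)
        history.append(spent)
    return history

# toy run: a trivial "model" and a strategy that keeps the pool order
hist = run_al_experiment(
    pool=range(100),
    label_fn=lambda x: x % 2,
    train_fn=lambda data: None,
    strategy=lambda model, pool: list(pool),
)
```

Fixing these parameters per experiment is what makes two AL strategies comparable: both see the same seed set, query size, and total budget.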
An application of a scanning light-addressable potentiometric sensor for label-free DNA detection
(2013)
Mathematical morphology is a part of image processing that has proven to be fruitful for numerous applications. Two main operations in mathematical morphology are dilation and erosion. These are based on the construction of a supremum or infimum with respect to an order over the tonal range in a certain section of the image. The tonal ordering can easily be realised in grey-scale morphology, and some morphological methods have been proposed for colour morphology. However, all of these have certain limitations.
In this paper we present a novel approach to colour morphology extending upon previous work in the field based on the Loewner order. We propose to consider an approximation of the supremum by means of a log-sum exponentiation introduced by Maslov. We apply this to the embedding of an RGB image in a field of symmetric 2x2 matrices. In this way we obtain nearly isotropic matrices representing colours and the structural advantage of transitivity. In numerical experiments we highlight some remarkable properties of the proposed approach.
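The scalar core of the Maslov-type log-sum approximation can be sketched as follows; the paper applies it to an embedding into symmetric 2x2 matrices, whereas this toy version only shows how the log-sum of exponentials tends to the supremum as the parameter grows.

```python
import math

def soft_sup(values, p=50.0):
    """Maslov-style approximation of the supremum:
    (1/p) * log(sum_i exp(p * x_i)), which tends to max(values) as p grows.
    Subtracting the maximum first keeps the exponentials from overflowing."""
    m = max(values)
    return m + math.log(sum(math.exp(p * (v - m)) for v in values)) / p

vals = [0.1, 0.5, 0.9]
approx = soft_sup(vals, p=100.0)
```

Unlike a hard componentwise maximum, this smooth surrogate is differentiable, which is part of what makes the matrix-valued construction in the paper workable.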
In collaborative research projects, both researchers and practitioners work together solving business-critical challenges. These projects often deal with ETL processes, in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are not deterministic, their quality of output may decrease over time. This fact leads to an overall quality loss of the application which embeds machine learning models. Hence, the software qualities in development and production may differ.
Machine learning models are black boxes. That makes practitioners skeptical and increases the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production offers an early response capability on quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that they integrate possible new inputs into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with an automated software quality monitoring in productive use and a possibility to auto-generate new test data coming from processed documents in production.
Through automated monitoring of the software quality and auto-generated test data, this approach ensures that the software quality meets and keeps requested thresholds in productive use, even during further continuous deployment and changing input data.
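A minimal sketch of such a quality monitor follows; the metric, window size, and threshold are illustrative assumptions, not the reference implementation. The idea is simply to alert when the rolling mean of a production quality metric drops below the requested threshold.

```python
def check_quality(metric_history, threshold, window=3):
    """Alert when the rolling mean of a production quality metric (e.g. F1
    measured on auto-generated test data) falls below the requested
    threshold (illustrative monitor, not the reference implementation)."""
    recent = metric_history[-window:]
    mean = sum(recent) / len(recent)
    return {"mean": mean, "alert": mean < threshold}

# metric drifting downwards over successive production batches
status = check_quality([0.91, 0.90, 0.84, 0.79, 0.72], threshold=0.85)
```

A monitor of this kind gives the early response capability mentioned above: the alert can trigger retraining with the newly collected production inputs.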
Reliable methods for automatic readability assessment have the potential to impact a variety of fields, ranging from machine translation to self-informed learning. Recently, large language models for the German language (such as GBERT and GPT-2-Wechsel) have become available, allowing the development of Deep Learning based approaches that promise to further improve automatic readability assessment. In this contribution, we studied the ability of ensembles of fine-tuned GBERT and GPT-2-Wechsel models to reliably predict the readability of German sentences. We combined these models with linguistic features and investigated the dependence of prediction performance on ensemble size and composition. Mixed ensembles of GBERT and GPT-2-Wechsel performed better than ensembles of the same size consisting of only GBERT or GPT-2-Wechsel models. Our models were evaluated in the GermEval 2022 Shared Task on Text Complexity Assessment on data of German sentences. On out-of-sample data, our best ensemble achieved a root mean squared error of 0.435.
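The ensemble step can be illustrated as plain prediction averaging followed by the root mean squared error used as the shared-task metric; this is a generic sketch, not the submission code, and the member predictions below are hypothetical.

```python
import math

def ensemble_rmse(model_preds, targets):
    """Average per-sentence readability scores over ensemble members and
    report the root mean squared error against gold scores (generic
    evaluation sketch)."""
    n = len(targets)
    avg = [sum(p[i] for p in model_preds) / len(model_preds) for i in range(n)]
    return math.sqrt(sum((a - t) ** 2 for a, t in zip(avg, targets)) / n)

# two hypothetical ensemble members scoring three sentences
rmse = ensemble_rmse([[1.0, 2.0, 3.0], [1.2, 2.2, 2.6]], targets=[1.0, 2.0, 3.0])
```

Averaging is also where mixing model families pays off: errors of GBERT and GPT-2-Wechsel members need not be correlated, so the averaged prediction can beat a same-size homogeneous ensemble.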
An automated, computer-aided test system for the functional testing and characterisation of (bio-)chemical sensors at wafer level was developed and integrated into a conventional probe station. The system enables the characterisation and identification of "functional" sensors already at wafer level between the individual fabrication steps, so that further processing steps that were previously standard, such as mounting, bonding, and encapsulation, can be omitted for defective or non-functional sensor structures. In addition, a specially designed miniaturised flow-through measuring cell makes it possible to characterise the sensitivity, drift, hysteresis, and response time of the (bio-)chemical sensors already at wafer level. The system was tested exemplarily with capacitive, pH-sensitive EIS (electrolyte-insulator-silicon) structures and ISFET (ion-sensitive field-effect transistor) structures with different geometries and gate layouts.
The overall objective of this study is to develop a new external fixator, which closely maps the native kinematics of the elbow to decrease the joint force resulting in reduced rehabilitation time and pain. An experimental setup was designed to determine the native kinematics of the elbow during flexion of cadaveric arms. As a preliminary study, data from literature was used to modify a published biomechanical model for the calculation of the joint and muscle forces. They were compared to the original model and the effect of the kinematic refinement was evaluated. Furthermore, the obtained muscle forces were determined in order to apply them in the experimental setup. The joint forces in the modified model differed slightly from the forces in the original model. The muscle force curves changed particularly for small flexion angles but their magnitude for larger angles was consistent.
Biomechanical simulation of different prosthetic meshes for repairing uterine/vaginal vault prolapse
(2017)
Tests with palm tree leaves have only just started, and the scan data are currently being analyzed. The final goal of the future project on palm tree gender and species recognition is to develop optical scanning technology to be applied to date palm tree leaves for in-situ screening purposes. Depending on the software used and the particular requirements of the users, the technology should potentially be able to identify palm tree diseases, palm tree gender, and the species of young date palm trees by scanning their leaves.
This paper reports a first microbial biosensor for rapid and cost-effective determination of the organophosphorus pesticides fenitrothion and EPN. The biosensor consisted of the recombinant PNP-degrading/oxidizing bacterium Pseudomonas putida JS444, anchoring and displaying organophosphorus hydrolase (OPH) on its cell surface as the biological sensing element, and a dissolved oxygen electrode as the transducer. Surface-expressed OPH catalyzed the hydrolysis of fenitrothion and EPN to release 3-methyl-4-nitrophenol and p-nitrophenol, respectively, which were oxidized by the enzymatic machinery of Pseudomonas putida JS444 to carbon dioxide while consuming oxygen, which was measured and correlated to the concentration of organophosphates. Under the optimum operating conditions, the biosensor was able to measure as little as 277 ppb of fenitrothion and 1.6 ppm of EPN without interference from phenolic compounds and other commonly used pesticides such as carbamate pesticides, triazine herbicides, and organophosphate pesticides without a nitrophenyl substituent. The applicability of the biosensor to lake water was also demonstrated.
Proceedings of the International Conference on Material Theory and Nonlinear Dynamics, MatDyn, Hanoi, Vietnam, Sept. 24-26, 2007, 8 p. In this paper, a method is introduced to determine the limit load of general shells using the finite element method. The method is based on an upper-bound limit and shakedown analysis with an elastic-perfectly plastic material model. A nonlinear constrained optimisation problem is solved using Newton's method in conjunction with a penalty method and the Lagrangian dual method. A numerical investigation of a pipe bend subjected to bending moments proves the effectiveness of the algorithm.
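The penalty/Newton idea can be illustrated on a toy constrained problem, minimising (x-2)^2 subject to x <= 1: the constraint is replaced by a quadratic penalty whose weight is increased in stages, and each stage is solved by Newton steps. The actual method operates on the much larger nonlinear shakedown optimisation problem; this is only a one-variable sketch.

```python
def penalty_newton(mu_schedule=(1.0, 10.0, 100.0, 1000.0), x=2.0):
    """Minimise (x - 2)^2 subject to x <= 1 with a quadratic penalty
    mu * max(0, x - 1)^2 and Newton iterations per penalty weight
    (toy version of a penalty/Newton scheme)."""
    for mu in mu_schedule:
        for _ in range(20):  # Newton iterations for fixed mu
            # gradient and Hessian of objective plus active penalty term
            g = 2.0 * (x - 2.0) + (2.0 * mu * (x - 1.0) if x > 1.0 else 0.0)
            h = 2.0 + (2.0 * mu if x > 1.0 else 0.0)
            x -= g / h
    return x

x_opt = penalty_newton()
```

As the penalty weight grows, the unconstrained minimiser (2 + mu)/(1 + mu) approaches the constrained optimum x = 1, which is the mechanism that drives the iterates towards feasibility in the full shakedown problem as well.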
The integration of frequently changing, volatile product data from different manufacturers into a single catalog is a significant challenge for small and medium-sized e-commerce companies. They rely on timely integrating product data to present them aggregated in an online shop without knowing format specifications, concept understanding of manufacturers, and data quality. Furthermore, format, concepts, and data quality may change at any time. Consequently, integrating product catalogs into a single standardized catalog is often a laborious manual task. Current strategies to streamline or automate catalog integration use techniques based on machine learning, word vectorization, or semantic similarity. However, most approaches struggle with low-quality or real-world data. We propose Attribute Label Ranking (ALR) as a recommendation engine to simplify the integration process of previously unknown, proprietary tabular format into a standardized catalog for practitioners. We evaluate ALR by focusing on the impact of different neural network architectures, language features, and semantic similarity. Additionally, we consider metrics for industrial application and present the impact of ALR in production and its limitations.
The integration of product data from heterogeneous sources and manufacturers into a single catalog is often still a laborious, manual task. Especially small- and medium-sized enterprises face the challenge of timely integrating the data their business relies on to have an up-to-date product catalog, due to format specifications, low data quality, and the requirement of expert knowledge. Additionally, modern approaches to simplify catalog integration demand experience in machine learning, word vectorization, or semantic similarity that such enterprises do not have. Furthermore, most approaches struggle with low-quality data. We propose Attribute Label Ranking (ALR), an easy-to-understand and simple-to-adapt learning approach. ALR leverages a model trained on real-world integration data to identify the best possible schema mapping of a previously unknown, proprietary, tabular format into a standardized catalog schema. Our approach predicts multiple labels for every attribute of an input column. The whole column is taken into consideration to rank among these labels. We evaluate ALR regarding the correctness of predictions and compare the results on real-world data to state-of-the-art approaches. Additionally, we report findings during experiments and limitations of our approach.
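The column-level ranking step can be sketched as pooling per-cell label predictions over the whole column; this simple voting rule is an illustrative stand-in for ALR's learned ranking, not the approach itself, and the candidate labels are hypothetical.

```python
from collections import Counter

def rank_labels(cell_predictions):
    """Rank candidate schema labels for one input column by pooling the
    per-cell label predictions over the whole column (illustrative
    stand-in for a learned label ranking)."""
    votes = Counter()
    for labels in cell_predictions:                 # each cell proposes ranked labels
        for rank, label in enumerate(labels):
            votes[label] += 1.0 / (rank + 1)        # higher-ranked labels weigh more
    return [label for label, _ in votes.most_common()]

# hypothetical per-cell predictions for one column of a proprietary table
ranking = rank_labels([
    ["voltage", "power"],
    ["voltage", "current"],
    ["power", "voltage"],
])
```

Pooling over the whole column is the key design choice: a single ambiguous cell cannot dominate the mapping, because the ranking aggregates evidence from every cell.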
Chemische Sensoren mit Bariumstrontiumtitanat als funktionelle Schicht zur Multiparameterdetektion
(2013)
Applications of Graph Transformations with Industrial Relevance, Lecture Notes in Computer Science, 2004, Volume 3062/2004, 434-439, DOI: http://dx.doi.org/10.1007/978-3-540-25959-6_33 This paper gives a brief overview of the tools we have developed to support conceptual design in civil engineering. Based on the UPGRADE framework, two applications, one for the knowledge engineer and another for architects, allow domain-specific knowledge to be stored and used during conceptual design. Consistency analyses check the design against the defined knowledge and inform the architect if rules are violated.
In the research domain of energy informatics, the importance of open data is rising rapidly, as various new public datasets are being created and published. Unfortunately, in many cases the data is not available under a permissive license corresponding to the FAIR principles, often lacking accessibility or reusability. Furthermore, the source format often differs from the desired data format or does not meet the demands to be queried in an efficient way. To solve this on a small scale, a toolbox for ETL processes is provided to create a local energy data server with open-access data from different valuable sources in a structured format. So while the sources themselves do not fully comply with the FAIR principles, the provided unique toolbox allows for an efficient processing of the data as if the FAIR principles were met. The energy data server currently includes information on power systems, weather data, network frequency data, European energy and gas data for demand and generation, and more. However, a solution to the core problem, the missing alignment with the FAIR principles, is still needed for the National Research Data Infrastructure.
Design and implementation aspects of a 3D reconstruction algorithm for the Jülich TierPET system
(1997)
This paper presents the direct route to Design by Analysis (DBA) of the new European pressure vessel standard in the language of limit and shakedown analysis (LISA). This approach leads to an optimization problem. Its solution with finite element analysis is demonstrated for some examples from the DBA manual. One observation from the examples is that the optimisation approach gives reliable and close lower-bound solutions, leading to simple and optimised design decisions.
Detection of Adrenaline Based on Bioelectrocatalytical System to Support Tumor Diagnostic Technology
(2017)
An H2O2 sensor for application in industrial sterilisation processes has been developed. To this end, automated sterilisation equipment at laboratory scale has been constructed using parts from industrial sterilisation facilities. In addition, a software tool has been developed for the control of the sterilisation equipment at laboratory scale. First measurements with the developed sensor set-up as part of the sterilisation equipment have been performed, and the sensor has been physically characterised by optical microscopy and SEM.
We present the novel concept of a combined drilling and melting probe for subsurface ice research. This probe, named “IceMole”, is currently developed, built, and tested at the FH Aachen University of Applied Sciences’ Astronautical Laboratory. Here, we describe its first prototype design and report the results of its field tests on the Swiss Morteratsch glacier. Although the IceMole design is currently adapted to terrestrial glaciers and ice shields, it may later be modified for the subsurface in-situ investigation of extraterrestrial ice, e.g., on Mars, Europa, and Enceladus. If life exists on those bodies, it may be present in the ice (as life can also be found in the deep ice of Earth).
The discovery of human induced pluripotent stem cells reprogrammed from somatic cells [1] and their ability to differentiate into cardiomyocytes (hiPSC-CMs) has provided a robust platform for drug screening [2]. Drug screenings are essential in the development of new compounds, particularly for evaluating the potential of drugs to induce life-threatening pro-arrhythmias. Between 1988 and 2009, 14 drugs were removed from the market for this reason [3]. The microelectrode array (MEA) technique is a robust tool for drug screening, as it detects the field potentials (FPs) of the entire cell culture. Furthermore, the propagation of the field potential can be examined on a per-electrode basis. To analyze MEA measurements in detail, we have developed an open-source tool.
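A minimal sketch of what per-electrode event detection in such an analysis might look like follows: thresholding with a refractory gap so one field potential is not counted twice. This is an assumption for illustration, not the open-source tool's actual detector, and the trace values are made up.

```python
def detect_field_potentials(signal, threshold, refractory=5):
    """Detect field-potential events as threshold crossings on one MEA
    electrode trace; a refractory gap (in samples) prevents a single FP
    from being counted twice (illustrative sketch)."""
    events, last = [], -refractory
    for i, v in enumerate(signal):
        if abs(v) >= threshold and i - last >= refractory:
            events.append(i)   # record the sample index of the event
            last = i
    return events

# toy electrode trace with two field-potential deflections
trace = [0.1, 0.0, -2.5, -1.9, 0.2, 0.1, 0.0, 3.0, 0.2]
spikes = detect_field_potentials(trace, threshold=1.5, refractory=3)
```

Running such a detector per electrode yields event times whose differences across the array describe the propagation of the field potential through the culture.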
Abstracts of the ACHEMA 2000 - International Meeting on Chemical Engineering, Environmental Protection and Biotechnology, May 22-27, 2000, Frankfurt am Main. In: Achema 2000: special edition / ed.: Linde AG; red.: Volker R. Leski. Wiesbaden: Linde AG, 2000. 56 p., pp. 79-81.
We propose a stochastic programming method to analyse the limit and shakedown loads of structures under random strength with lognormal distribution. In this investigation, a dual chance-constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit or the shakedown limit. The edge-based smoothed finite element method (ES-FEM) using three-node linear triangular elements is employed.
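The role of the lognormal strength distribution can be illustrated with the quantile that enters a chance-constrained formulation: the strength value that the random strength exceeds with a prescribed probability. Parameter names below are illustrative, and the paper's dual algorithm computes bounds on the limit load itself rather than this single quantile.

```python
import math
from statistics import NormalDist

def reliable_strength(median, cov, prob_failure):
    """Strength value that a lognormally distributed material strength
    exceeds with probability 1 - prob_failure: the kind of quantile
    appearing in chance-constrained limit/shakedown formulations
    (illustrative sketch)."""
    sigma = math.sqrt(math.log(1.0 + cov ** 2))  # lognormal shape from CoV
    z = NormalDist().inv_cdf(prob_failure)       # lower-tail standard normal quantile
    return median * math.exp(sigma * z)

# e.g. median strength 300, 10% coefficient of variation, 5% exceedance risk
r = reliable_strength(median=300.0, cov=0.1, prob_failure=0.05)
```

The chance constraint then requires the stress field to stay admissible with respect to this reduced strength, so the computed load bounds hold with the prescribed reliability.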
Safety and reliability of structures may be assessed indirectly by stress distributions. Limit and shakedown theorems are simplified but exact methods of plasticity that provide safety factors directly in the loading space. These theorems may be used for a direct definition of the limit state function for failure by plastic collapse or by inadaptation. In a FEM formulation the limit state function is obtained from a nonlinear optimization problem. This direct approach reduces considerably the necessary knowledge of uncertain technological input data, the computing time, and the numerical error. Moreover, the direct way leads to highly effective and precise reliability analyses. The theorems are implemented into a general purpose FEM program in a way capable of large-scale analysis.
Accurate knowledge of the stresses and deformations in passive components is obtained with detailed inelastic FEM analyses. However, the local loading cannot be compared directly with a load-bearing capacity in the structural-mechanics sense. If one concentrates on the question of the load-carrying capacity, the analysis becomes simpler. Within the framework of plasticity theory, limit and shakedown analysis compute the bearable loads directly and exactly. This contribution presents an implementation of the limit and shakedown theorems in a general-purpose FEM program, with which the load-carrying capacity of passive components is computed directly. The concepts used are explained with reference to conventional structural analysis. Examples with locally high loading illustrate the application of the FEM-based limit and shakedown analyses. The computed interaction diagrams give a good overview of the possible operating ranges of passive components. Limit analysis also offers a structural-mechanics approach to the collapse load of cracked components made of high-toughness material.
DNA-hybridization detection using light-addressable potentiometric sensor modified with gold layer
(2014)
Conventional EEG devices cannot be used in everyday life; hence, over the past decade research has focused on Ear-EEG for mobile, at-home monitoring in applications ranging from emotion detection to sleep monitoring. As the area available for electrode contact in the ear is limited, electrode size and location play a vital role in an Ear-EEG system. In this investigation, we present a quantitative study of ear electrodes of two sizes at different locations in wet and dry configurations. Electrode impedance scales inversely with size and ranges from 450 kΩ to 1.29 MΩ for dry and from 22 kΩ to 42 kΩ for wet contact at 10 Hz. For either size, the location in the ear canal with the lowest impedance is ELE (Left Ear Superior), presumably due to increased contact pressure caused by the outer-ear anatomy. The results can be used to optimize signal pickup and SNR for specific applications. We demonstrate this by recording sleep spindles during sleep onset with high quality (5.27 μVrms).
Pulmonary arterial cannulation is a common and effective method of percutaneous mechanical circulatory support for concurrent right heart and respiratory failure [1]. However, limited data exist on the effect that the positioning of the cannula has on oxygen perfusion throughout the pulmonary artery (PA). This study aims to evaluate, using computational fluid dynamics (CFD), the effect of different cannula positions in the PA on the oxygenation of the different branching vessels, in order to determine an optimal cannula position. The four chosen cannula positions (see Fig. 1) are: in the lower part of the main pulmonary artery (MPA); in the MPA at the junction between the right pulmonary artery (RPA) and the left pulmonary artery (LPA); in the RPA at its first branch; and in the LPA at its first branch.
From these results we conclude that proteins, mainly in vitro, denature completely at temperatures between 57 °C and 62 °C, and that they are also affected by NO and by different types of ions. NO in particular causes earlier protein denaturation, which means that NO has a destabilizing effect on proteins. Different ions likewise alter protein denaturation: some ions cause earlier denaturation while others do not.
Effectiveness of the edge-based smoothed finite element method applied to soft biological tissues
(2012)
Summary and Conclusions: PCIs were clearly effective in terms of their antibacterial effect on the strains tested. This efficacy increased with the time the bacteria were exposed to PCIs. The bactericidal action proved to be irreversible. PCIs were significantly less effective in shadowed areas. PCI exposure caused multiple types of protein damage, as observed in SDS-PAGE studies. There was no single molecular mechanism causing bacterial death, but several.
Limit and shakedown analyses are simplified yet exact methods of classical plasticity theory which, apart from sufficient ductility, involve no restrictive assumptions. The simplifications concern the acquisition of data and models for the details of the load history and the material behaviour. An FEM-based limit and shakedown analysis for perfectly plastic material was extended to a kinematic hardening material law and implemented in the finite element program PERMAS. In a simple tension-torsion experiment, a hollow specimen was loaded with constant torsion and cyclic tension in order to verify the new implementation. It was shown that the shakedown analysis agrees well with the experimental results. With hardening, considerably larger safety margins can be demonstrated. This potential requires further experimental validation. In parallel, shakedown theory needs to be extended to advanced hardening models.
Electromechanical model of hiPSC-derived ventricular cardiomyocytes cocultured with fibroblasts
(2018)
The CellDrum provides an experimental setup to study the mechanical effects of fibroblasts co-cultured with hiPSC-derived ventricular cardiomyocytes. Multi-scale computational models based on the Finite Element Method are developed. Coupled electrical cardiomyocyte-fibroblast models (cell level) are embedded into reaction-diffusion equations (tissue level) which compute the propagation of the action potential in the cardiac tissue. Electromechanical coupling is realised by an excitation-contraction model (cell level) and the active stress arising during contraction is added to the passive stress in the force balance, which determines the tissue displacement (tissue level). Tissue parameters in the model can be identified experimentally to the specific sample.
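The coupled cell/tissue levels described above can be illustrated with a deliberately minimal 1D monodomain sketch, using FitzHugh-Nagumo kinetics as a generic stand-in for the cardiomyocyte-fibroblast model; all parameter values, the grid, and the stimulus are hypothetical and not taken from the paper.

```python
import numpy as np

# Minimal 1D reaction-diffusion sketch with FitzHugh-Nagumo kinetics as a
# generic excitable-cell stand-in; parameters (D, a, b, eps, grid, dt) are
# hypothetical illustration values.
def simulate_ap_propagation(n=200, steps=4000, dt=0.05, dx=0.5,
                            D=0.1, a=0.1, eps=0.01, b=0.5):
    v = np.zeros(n)          # membrane potential (dimensionless)
    w = np.zeros(n)          # recovery variable
    v[:10] = 1.0             # stimulate the left end of the fibre
    for _ in range(steps):
        lap = np.zeros(n)    # no-flux boundaries via one-sided differences
        lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2
        lap[0] = (v[1] - v[0]) / dx**2
        lap[-1] = (v[-2] - v[-1]) / dx**2
        dv = D * lap + v * (1 - v) * (v - a) - w   # tissue + cell excitation
        dw = eps * (b * v - w)                     # slow cellular recovery
        v += dt * dv
        w += dt * dw
    return v

v = simulate_ap_propagation()  # an excitation pulse travels along the fibre
```

The reaction term plays the role of the cell-level excitation model, while the diffusion term corresponds to the tissue-level propagation of the action potential; the mechanical coupling of the paper (active stress in the force balance) is not reproduced here.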
In the expansion of sustainable, renewable energy supply, the conversion of organic biomass into biogas has great potential. The underlying complex biological process is still insufficiently understood and requires systematic investigation of the process parameters in order to achieve a high yield with good gas quality. The questions involved in decoding the process are of both a process-engineering and a microbiological nature. From a microbiological point of view, knowledge of the micro-organisms actually carrying the process is of considerable importance; from a process-engineering point of view, so is knowledge of the physical and chemical factors that control the microbiological processes. The interplay of all these parameters promotes or impedes biogas formation, up to the point of process failure.
One possible control method is the measurement of the metabolic activity of the process-carrying organisms.
Based on sound process data obtained from a parallel plant, this is to be realised with a light-addressable potentiometric sensor (LAPS) system. This sensor is able to detect pH changes caused by the metabolism of the organisms immobilised on the chip, thus enabling online monitoring of biogas plants.
This contribution presents results from the development of a modular solid-state sensor system for monitoring cell-culture fermentations. To measure electrolyte conductivity, the layout of the interdigitated electrodes was adapted for measurements in comparatively well-conducting electrolytes. By cross-linking glucose oxidase with glutaraldehyde and immobilising it on a platinum electrode, an amperometric glucose sensor with a linear measuring range of up to 2 mM and a sensitivity of 168 nA/mM was realised.
We compare four different algorithms for automatically estimating the muscle fascicle angle from ultrasonic images: the vesselness filter, the Radon transform, the projection profile method, and the gray-level co-occurrence matrix (GLCM). The algorithm results are compared to ground truth data generated by three different experts on 425 image frames from two videos recorded during different types of motion. The best agreement with the ground truth data was achieved by a combination of pre-processing with a vesselness filter and measuring the angle with the projection profile method. The robustness of the estimation is increased by applying the algorithms to subregions with high gradients and performing a LOESS fit through these estimates.
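The projection profile method named above can be sketched as follows (a minimal illustration, not the authors' implementation; the synthetic stripe image, angle grid, and variance criterion are assumptions): rotate the image over candidate angles and keep the angle at which the row-sum profile has maximal variance, i.e. where the stripe-like fascicles lie along the projection axis.

```python
import numpy as np
from scipy.ndimage import rotate

def projection_profile_angle(img, angles):
    """Return the candidate angle (degrees) maximising the variance of the
    row-sum projection profile of the rotated image."""
    best_angle, best_var = angles[0], -np.inf
    for a in angles:
        rot = rotate(img, a, reshape=False, order=1)
        profile = rot.sum(axis=1)   # project intensities onto the vertical axis
        var = profile.var()         # peaks sharply when stripes are horizontal
        if var > best_var:
            best_angle, best_var = a, var
    return best_angle

# Hypothetical synthetic stand-in for fascicle stripes at 20 degrees.
y, x = np.mgrid[0:128, 0:128]
phase = (x * np.sin(np.deg2rad(20)) + y * np.cos(np.deg2rad(20))) * 0.5
img = (np.sin(phase) > 0).astype(float)
est = projection_profile_angle(img, np.arange(-45, 46, 1))
```

In practice the method is applied to pre-processed (e.g. vesselness-filtered) subregions rather than a raw synthetic image, as described in the abstract.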
Summary: This paper presents a methodology for studying and understanding the mechanics of stapled anastomotic behavior by combining empirical experimentation and finite element analysis. The performance of stapled anastomoses is studied in terms of leakage, and numerical results are compared to in vitro experiments performed on fresh porcine tissue. Results suggest that leaks occur between the tissue and the staple legs penetrating through the tissue.
In recent years, the development of large pretrained language models such as BERT and GPT has significantly improved information extraction systems on various tasks, including relation classification. State-of-the-art systems are highly accurate on scientific benchmarks, but a lack of explainability currently complicates many real-world applications. Comprehensible systems are necessary to prevent biased, counterintuitive, or harmful decisions.
We introduce semantic extents, a concept for analyzing decision patterns in the relation classification task. Semantic extents are the most influential parts of texts with respect to classification decisions. Our definition allows similar procedures to determine semantic extents for humans and for models. We provide an annotation tool and a software framework to determine semantic extents for humans and models conveniently and reproducibly. Comparing both reveals that models tend to learn shortcut patterns from the data. These patterns are hard to detect with current interpretability methods, such as input reductions. Our approach can help detect and eliminate spurious decision patterns during model development. Semantic extents can increase the reliability and security of natural language processing systems and are an essential step toward enabling applications in critical areas like healthcare or finance. Moreover, our work opens new research directions for developing methods to explain deep learning models.
Messenger apps like WhatsApp or Telegram are an integral part of daily communication. Besides their various positive effects, these services extend the operating range of criminals. Open trading groups with many thousands of participants have emerged on Telegram. Law enforcement agencies monitor suspicious users in such chat rooms. This research shows that text analysis based on natural language processing facilitates this through a meaningful domain overview and detailed investigations. We crawled a corpus from such self-proclaimed black markets and annotated five attribute types: products, money, payment methods, user names, and locations. Based on each message a user sends, we extract and group these attributes to build profiles. Then, we build features to cluster the profiles. Pretrained word vectors yield better unsupervised clustering results than current state-of-the-art transformer models. The result is a semantically meaningful high-level overview of the user landscape of black-market chat rooms. Additionally, the extracted structured information serves as a foundation for further data exploration, for example, the most active users or preferred payment methods.
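The profile-clustering step could be sketched as follows, with tiny hand-made vectors standing in for the pretrained word vectors (the actual embeddings, attribute extraction, and cluster count from the study are not reproduced here; all names and values are hypothetical):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-user profiles: each user's extracted attribute tokens.
profiles = {
    "user_a": ["bitcoin", "paypal"],
    "user_b": ["paypal", "bitcoin"],
    "user_c": ["berlin", "hamburg"],
    "user_d": ["hamburg", "berlin"],
}

# Toy "pretrained" word vectors (stand-ins for real embeddings such as fastText).
word_vecs = {
    "bitcoin": np.array([1.0, 0.1]),
    "paypal":  np.array([0.9, 0.2]),
    "berlin":  np.array([0.1, 1.0]),
    "hamburg": np.array([0.2, 0.9]),
}

# A profile embedding is the mean of its attribute vectors.
users = list(profiles)
X = np.array([np.mean([word_vecs[t] for t in profiles[u]], axis=0) for u in users])

# Group the profiles; users with similar attribute vocabulary cluster together.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```

With real data, the averaged word vectors would be computed over all attributes extracted from a user's messages, giving the high-level user-landscape overview described above.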
GaAs-based Gunn diodes with graded AlGaAs hot-electron injector heterostructures have been developed for the special needs of automotive applications. The fabrication of the Gunn diode chips was based on total substrate removal and the processing of integrated Au heat sinks. In particular, the thermal and RF behavior of the diodes has been analyzed by DC, impedance, and S-parameter measurements. The electrical investigations have revealed the functionality of the hot-electron injector. An optimized layer structure could fulfill the requirements of adaptive cruise control (ACC) systems at 77 GHz, with a typical output power between 50 and 90 mW.
A new formulation to calculate the shakedown limit load of Kirchhoff plates under stochastic conditions of strength is developed. Direct structural reliability design by chance-constrained programming is based on prescribed failure probabilities; this is an effective approach of stochastic programming if it can be formulated as an equivalent deterministic optimization problem. We restrict uncertainty to the strength; the loading remains deterministic. A new formulation is derived for the case of random strength with a lognormal distribution. Upper-bound and lower-bound shakedown load factors are calculated simultaneously by a dual algorithm.
The structure of the female pelvic floor (PF) is an inter-related system of bony pelvis, muscles, pelvic organs, fascias, ligaments, and nerves with multiple functions. Mechanically, the pelvic organ support system is of two types: (I) the supporting system of the levator ani (LA) muscle, and (II) the suspension system of the endopelvic fascia condensation [1], [2]. Significant denervation injury to the pelvic musculature and depolymerization of the collagen fibrils of the soft vaginal hammock, cervical ring, and ligaments during pregnancy and vaginal delivery weaken the normal functions of the pelvic floor. Pelvic organ prolapse, incontinence, and sexual dysfunction are some of the dysfunctions which increase progressively with age and menopause due to the weakened support system, according to the Integral theory [3]. An improved 3D finite element model of the female pelvic floor as shown in Fig. 1 is constructed that: (I) considers the realistic support of the organs to the pelvic side walls, (II) employs the improvement of our previous FE model [4], [5] along with patient-based geometries, (III) incorporates the realistic anatomy and boundary conditions of the endopelvic (pubocervical and rectovaginal) fascia, and (IV) considers varying stiffness of the endopelvic fascia in the craniocaudal direction [3]. Several computations are carried out on the presented computational model with healthy and damaged supporting tissues, and comparisons are made to understand the physiopathology of female PF disorders.
Mechanical stimulation of the cells resulted in evident changes in cell morphology, protein composition, and gene expression. Microscopically, additional formation of stress fibers accompanied by cell rearrangements in a monolayer was observed. Also, significant activation of the p53 gene was revealed compared to the control. Interestingly, the use of the CellTech membrane coating induced cell death after mechanical stress had been applied. Such an effect was not detected when fibronectin had been used as the adhesion substrate.
A procedure for the evaluation of the failure probability of elastic-plastic thin shell structures is presented. The procedure involves a deterministic limit and shakedown analysis for each probabilistic iteration, which is based on the kinematical approach and the use of the exact Ilyushin yield surface. Based on a direct definition of the limit state function, the non-linear problems may be efficiently solved by using the First and Second Order Reliability Methods (FORM/SORM). This direct approach considerably reduces the required knowledge of uncertain technological input data, the computing costs, and the numerical error. In: Computational plasticity / ed. by Eugenio Onate. Dordrecht: Springer 2007. VII, 265 S. (Computational Methods in Applied Sciences ; 7) (COMPLAS IX, Part 1. International Center for Numerical Methods in Engineering (CIMNE)). ISBN 978-1-402-06576-7, S. 186-189.