Conference Proceeding
Conventional EEG devices are impractical for everyday use; research over the past decade has therefore focused on Ear-EEG for mobile, at-home monitoring in applications ranging from emotion detection to sleep monitoring. As the area available for electrode contact in the ear is limited, electrode size and location play a vital role in an Ear-EEG system. In this investigation, we present a quantitative study of ear electrodes of two sizes at different locations in wet and dry configurations. Electrode impedance scales inversely with size and ranges from 450 kΩ to 1.29 MΩ for dry and from 22 kΩ to 42 kΩ for wet contact at 10 Hz. For either size, the location in the ear canal with the lowest impedance is ELE (Left Ear Superior), presumably due to increased contact pressure caused by the outer-ear anatomy. The results can be used to optimize signal pickup and SNR for specific applications. We demonstrate this by recording sleep spindles during sleep onset with high quality (5.27 μVrms).
DNA-hybridization detection using light-addressable potentiometric sensor modified with gold layer
(2014)
Accurate knowledge of the stresses and deformations in passive components is obtained with detailed inelastic FEM analyses. The local loading, however, cannot be compared directly with a load-bearing capacity in the structural-mechanics sense. If one focuses on the question of load-carrying capacity, the analysis is simplified. Within the framework of plasticity theory, limit and shakedown analyses compute the bearable loads directly and exactly. This contribution presents an implementation of the limit and shakedown theorems in a general-purpose FEM program, with which the load-carrying capacity of passive components is computed directly. The concepts used are explained with reference to conventional structural analysis. Examples with locally high loading illustrate the application of the FEM-based limit and shakedown analyses. The computed interaction diagrams give a good overview of the admissible operating regimes of passive components. Limit analysis also offers a structural-mechanics approach to the collapse load of cracked components made of high-toughness material.
Safety and reliability of structures may be assessed indirectly through stress distributions. Limit and shakedown theorems are simplified but exact methods of plasticity that provide safety factors directly in the loading space. These theorems may be used for a direct definition of the limit state function for failure by plastic collapse or by inadaptation. In a FEM formulation, the limit state function is obtained from a nonlinear optimization problem. This direct approach considerably reduces the required knowledge of uncertain technological input data, the computing time, and the numerical error. Moreover, the direct way leads to highly effective and precise reliability analyses. The theorems are implemented in a general-purpose FEM program in a form capable of large-scale analysis.
We propose a stochastic programming method for limit and shakedown analysis of structures whose strength is random with a lognormal distribution. In this investigation, a dual chance-constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit or the shakedown limit. The edge-based smoothed finite element method (ES-FEM) with three-node linear triangular elements is used.
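The chance constraint underlying such a formulation can be sketched in generic notation (the symbols below are assumptions for illustration, not necessarily the paper's exact notation): the load factor is maximized subject to the requirement that the shakedown condition holds with at least a prescribed probability under the lognormally distributed yield strength.

```latex
% Sketch of a chance-constrained shakedown problem (generic notation):
% \alpha: load factor, \sigma^{E}: fictitious elastic stress,
% \bar{\rho}: time-independent residual stress field,
% \varphi: yield function, \sigma_y: lognormal yield strength, p: required probability.
\begin{aligned}
\max_{\alpha,\;\bar{\rho}} \quad & \alpha \\
\text{s.t.} \quad
  & \Pr\!\bigl[\,\varphi\bigl(\alpha\,\sigma^{E}(x,t) + \bar{\rho}(x)\bigr)
      \le \sigma_y(x)\,\bigr] \ge p \quad \forall x,\,t, \\
  & \bar{\rho}\ \text{self-equilibrated.}
\end{aligned}
```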
Abstracts of the ACHEMA 2000 - International Meeting on Chemical Engineering, Environmental Protection and Biotechnology, May 22-27, 2000, Frankfurt am Main. In: Achema 2000: special edition / ed.: Linde AG; red.: Volker R. Leski. Wiesbaden: Linde AG, 2000. 56 pp., ill., pp. 79-81.
The discovery of human induced pluripotent stem cells reprogrammed from somatic cells [1] and their ability to differentiate into cardiomyocytes (hiPSC-CMs) has provided a robust platform for drug screening [2]. Drug screenings are essential in the development of new compounds, particularly for evaluating the potential of drugs to induce life-threatening pro-arrhythmias. Between 1988 and 2009, 14 drugs were removed from the market for this reason [3]. The microelectrode array (MEA) technique is a robust tool for drug screening, as it detects the field potentials (FPs) of the entire cell culture. Furthermore, the propagation of the field potential can be examined per electrode. To analyze MEA measurements in detail, we have developed an open-source tool.
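As a rough illustration of the kind of per-electrode analysis such a tool performs, the following sketch detects field-potential events by robust threshold crossing. The method and all parameters here are illustrative assumptions, not the tool's actual implementation.

```python
import numpy as np

def detect_fp_events(signal, threshold_factor=5.0):
    """Detect field-potential events as threshold crossings.

    The threshold is a multiple of a robust noise estimate
    (median absolute deviation), a common choice for spike-like
    signals.  Returns the index of the first sample of each
    supra-threshold run in the rectified signal.
    """
    noise = np.median(np.abs(signal)) / 0.6745  # MAD-based noise estimate
    thresh = threshold_factor * noise
    above = np.abs(signal) > thresh
    # Rising edges: samples that are above threshold while the previous one is not
    return np.flatnonzero(above & ~np.r_[False, above[:-1]])

# Synthetic trace: low-amplitude noise with two injected "FP" deflections
fs = 10_000
rng = np.random.default_rng(0)
sig = 0.01 * rng.standard_normal(fs)
sig[2000] = 1.0
sig[7000] = -1.0
events = detect_fp_events(sig)
```

With a 5-sigma threshold, the two injected deflections are found while the background noise stays below threshold.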
We present the novel concept of a combined drilling and melting probe for subsurface ice research. This probe, named “IceMole”, is currently being developed, built, and tested at the FH Aachen University of Applied Sciences’ Astronautical Laboratory. Here, we describe its first prototype design and report the results of its field tests on the Swiss Morteratsch glacier. Although the IceMole design is currently adapted to terrestrial glaciers and ice sheets, it may later be modified for the subsurface in-situ investigation of extraterrestrial ice, e.g., on Mars, Europa, and Enceladus. If life exists on those bodies, it may be present in the ice (as life can also be found in the deep ice of Earth).
An H2O2 sensor for application in industrial sterilisation processes has been developed. For this purpose, automated sterilisation equipment at laboratory scale was constructed using parts from industrial sterilisation facilities. In addition, a software tool was developed to control the laboratory-scale sterilisation equipment. First measurements with the developed sensor set-up as part of the sterilisation equipment have been performed, and the sensor has been physically characterised by optical microscopy and SEM.
Detection of Adrenaline Based on Bioelectrocatalytical System to Support Tumor Diagnostic Technology
(2017)
This paper presents the direct route to Design by Analysis (DBA) of the new European pressure vessel standard in the language of limit and shakedown analysis (LISA). This approach leads to an optimization problem. Its solution with Finite Element Analysis is demonstrated for some examples from the DBA-Manual. One observation from the examples is that the optimisation approach gives reliable and close lower-bound solutions, leading to simple and optimised design decisions.
Design and implementation aspects of a 3D reconstruction algorithm for the Jülich TierPET system
(1997)
Applications of Graph Transformations with Industrial Relevance. Lecture Notes in Computer Science, 2004, Volume 3062/2004, 434-439, DOI: http://dx.doi.org/10.1007/978-3-540-25959-6_33. This paper gives a brief overview of the tools we have developed to support conceptual design in civil engineering. Based on the UPGRADE framework, two applications, one for the knowledge engineer and another for architects, allow storing domain-specific knowledge and using this knowledge during conceptual design. Consistency analyses check the design against the defined knowledge and inform the architect if rules are violated.
Chemical sensors with barium strontium titanate as a functional layer for multiparameter detection
(2013)
The integration of product data from heterogeneous sources and manufacturers into a single catalog is often still a laborious, manual task. Small and medium-sized enterprises in particular face the challenge of timely integration of the data their business relies on to keep their product catalog up to date, owing to format specifications, low data quality, and the required expert knowledge. Additionally, modern approaches to simplifying catalog integration demand experience in machine learning, word vectorization, or semantic similarity that such enterprises do not have. Furthermore, most approaches struggle with low-quality data. We propose Attribute Label Ranking (ALR), an easy-to-understand and simple-to-adapt learning approach. ALR leverages a model trained on real-world integration data to identify the best possible schema mapping of a previously unknown, proprietary, tabular format into a standardized catalog schema. Our approach predicts multiple labels for every attribute of an input column; the whole column is taken into consideration to rank among these labels. We evaluate ALR regarding the correctness of predictions and compare the results on real-world data to state-of-the-art approaches. Additionally, we report findings from our experiments and the limitations of our approach.
The integration of frequently changing, volatile product data from different manufacturers into a single catalog is a significant challenge for small and medium-sized e-commerce companies. They rely on timely integrating product data to present them aggregated in an online shop without knowing format specifications, concept understanding of manufacturers, and data quality. Furthermore, format, concepts, and data quality may change at any time. Consequently, integrating product catalogs into a single standardized catalog is often a laborious manual task. Current strategies to streamline or automate catalog integration use techniques based on machine learning, word vectorization, or semantic similarity. However, most approaches struggle with low-quality or real-world data. We propose Attribute Label Ranking (ALR) as a recommendation engine to simplify the integration process of previously unknown, proprietary tabular format into a standardized catalog for practitioners. We evaluate ALR by focusing on the impact of different neural network architectures, language features, and semantic similarity. Additionally, we consider metrics for industrial application and present the impact of ALR in production and its limitations.
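To illustrate the core idea of ranking candidate schema labels over a whole column rather than per cell, here is a minimal sketch. The per-cell classifier, labels, and scores are hypothetical; ALR's actual model is a neural network trained on integration data.

```python
from collections import Counter

def rank_column_labels(cell_predictions):
    """Rank candidate schema labels for one input column.

    `cell_predictions` holds, per cell, a list of (label, score)
    pairs from some per-cell classifier (hypothetical here).
    Scores are summed over the whole column, so every cell
    contributes to the final ranking.
    """
    totals = Counter()
    for pairs in cell_predictions:
        for label, score in pairs:
            totals[label] += score
    return [label for label, _ in totals.most_common()]

# Hypothetical predictions for three cells of one column
preds = [
    [("ean", 0.7), ("sku", 0.2)],
    [("ean", 0.6), ("weight", 0.3)],
    [("sku", 0.5), ("ean", 0.4)],
]
ranking = rank_column_labels(preds)
```

Aggregating over the column makes the ranking robust to individual noisy cells, which is the point of taking the whole column into consideration.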
Proceedings of the International Conference on Material Theory and Nonlinear Dynamics (MatDyn), Hanoi, Vietnam, Sept. 24-26, 2007, 8 p. In this paper, a method is introduced to determine the limit load of general shells using the finite element method. The method is based on an upper-bound limit and shakedown analysis with an elastic-perfectly plastic material model. A nonlinear constrained optimisation problem is solved using Newton's method in conjunction with a penalty method and the Lagrangian dual method. A numerical investigation of a pipe bend subjected to bending moments proves the effectiveness of the algorithm.
This paper reports the first microbial biosensor for rapid and cost-effective determination of the organophosphorus pesticides fenitrothion and EPN. The biosensor consists of the recombinant PNP-degrading/oxidizing bacterium Pseudomonas putida JS444, anchoring and displaying organophosphorus hydrolase (OPH) on its cell surface as the biological sensing element, and a dissolved-oxygen electrode as the transducer. Surface-expressed OPH catalyzed the hydrolysis of fenitrothion and EPN to release 3-methyl-4-nitrophenol and p-nitrophenol, respectively, which were oxidized to carbon dioxide by the enzymatic machinery of Pseudomonas putida JS444 while consuming oxygen; the oxygen consumption was measured and correlated with the concentration of organophosphates. Under optimum operating conditions, the biosensor was able to measure as little as 277 ppb of fenitrothion and 1.6 ppm of EPN without interference from phenolic compounds or other commonly used pesticides such as carbamate pesticides, triazine herbicides, and organophosphate pesticides without a nitrophenyl substituent. The applicability of the biosensor to lake water was also demonstrated.
Tests with palm tree leaves have only just started, and the scan data are currently being analyzed. The final goal of a future project on palm tree gender and species recognition will be to develop optical scanning technology applied to date palm tree leaves for in-situ screening purposes. Depending on the software used and the particular requirements of the users, the technology should be able to identify palm tree diseases, palm tree gender, and the species of young date palm trees by scanning their leaves.
Biomechanical simulation of different prosthetic meshes for repairing uterine/vaginal vault prolapse
(2017)
The overall objective of this study is to develop a new external fixator that closely maps the native kinematics of the elbow, decreasing the joint force and thereby reducing rehabilitation time and pain. An experimental setup was designed to determine the native kinematics of the elbow during flexion of cadaveric arms. As a preliminary study, data from the literature were used to modify a published biomechanical model for the calculation of the joint and muscle forces. The results were compared to the original model, and the effect of the kinematic refinement was evaluated. Furthermore, the obtained muscle forces were determined in order to apply them in the experimental setup. The joint forces in the modified model differed slightly from those in the original model. The muscle-force curves changed particularly for small flexion angles, but their magnitude for larger angles was consistent.
Reliable methods for automatic readability assessment have the potential to impact a variety of fields, ranging from machine translation to self-informed learning. Recently, large language models for the German language (such as GBERT and GPT-2-Wechsel) have become available, allowing the development of Deep Learning based approaches that promise to further improve automatic readability assessment. In this contribution, we studied the ability of ensembles of fine-tuned GBERT and GPT-2-Wechsel models to reliably predict the readability of German sentences. We combined these models with linguistic features and investigated the dependence of prediction performance on ensemble size and composition. Mixed ensembles of GBERT and GPT-2-Wechsel performed better than ensembles of the same size consisting of only GBERT or only GPT-2-Wechsel models. Our models were evaluated in the GermEval 2022 Shared Task on Text Complexity Assessment on data of German sentences. On out-of-sample data, our best ensemble achieved a root mean squared error of 0.435.
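A minimal sketch of the ensembling step: per-sentence scores are averaged over the members, and the combined prediction is scored with RMSE. The member outputs and gold scores below are invented for illustration; the actual members are fine-tuned transformer models.

```python
import math

def ensemble_predict(member_predictions):
    """Average per-sentence readability scores across ensemble members."""
    n = len(member_predictions)
    return [sum(scores) / n for scores in zip(*member_predictions)]

def rmse(pred, target):
    """Root mean squared error between predictions and gold scores."""
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred))

# Two hypothetical members (e.g. one GBERT and one GPT-2-Wechsel model)
m1 = [1.2, 3.4, 2.0]
m2 = [1.0, 3.0, 2.4]
combined = ensemble_predict([m1, m2])
gold = [1.1, 3.1, 2.3]
error = rmse(combined, gold)
```

Averaging tends to cancel uncorrelated member errors, which is one plausible reason mixed ensembles outperform homogeneous ones of the same size.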
In collaborative research projects, researchers and practitioners work together to solve business-critical challenges. These projects often deal with ETL processes in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are not deterministic, their output quality may decrease over time. This leads to an overall quality loss in applications that embed machine learning models; hence, software quality in development and in production may differ.
Machine learning models are black boxes. This makes practitioners skeptical and raises the inhibition threshold for the early productive use of research prototypes. Continuous monitoring of software quality in production enables an early response to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that possible new inputs are integrated into model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with automated software-quality monitoring in productive use and the possibility to auto-generate new test data from documents processed in production.
Through automated monitoring of software quality and auto-generated test data, this approach ensures that the software quality meets and keeps the requested thresholds in productive use, even during further continuous deployment and with changing input data.
Mathematical morphology is a part of image processing that has proven to be fruitful for numerous applications. Two main operations in mathematical morphology are dilation and erosion. These are based on the construction of a supremum or infimum with respect to an order over the tonal range in a certain section of the image. The tonal ordering can easily be realised in grey-scale morphology, and some morphological methods have been proposed for colour morphology. However, all of these have certain limitations.
In this paper we present a novel approach to colour morphology that extends previous work in the field based on the Loewner order. We propose to approximate the supremum by means of a log-sum exponentiation introduced by Maslov. We apply this to the embedding of an RGB image in a field of symmetric 2×2 matrices. In this way we obtain nearly isotropic matrices representing colours, along with the structural advantage of transitivity. In numerical experiments we highlight some remarkable properties of the proposed approach.
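The key ingredient, a smooth approximation of the supremum via log-sum exponentiation, can be sketched for scalars as follows. The paper applies this to symmetric 2×2 matrices; this scalar version only illustrates the approximation behaviour, and the sharpness parameter is an assumed name.

```python
import math

def logsum_max(values, s=50.0):
    """Smooth supremum approximation via log-sum exponentiation:

        M_s(x) = (1/s) * log(sum_i exp(s * x_i))

    As s grows, M_s(x) converges to max(x) from above; the error
    is bounded by log(n)/s for n values.
    """
    m = max(values)  # subtract the max for numerical stability
    return m + math.log(sum(math.exp(s * (v - m)) for v in values)) / s

vals = [0.2, 0.8, 0.5]
approx = logsum_max(vals, s=100.0)
```

Unlike a componentwise maximum of matrices, this formulation stays within the matrix field and inherits useful structural properties, which is what makes it attractive for colour morphology.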
An application of a scanning light-addressable potentiometric sensor for label-free DNA detection
(2013)
Supervised machine learning and deep learning require a large amount of labeled data, which data scientists obtain through a manual, time-consuming annotation process. To mitigate this challenge, Active Learning (AL) proposes promising data points for annotators to annotate next, instead of a sequential or random sample. This method is supposed to save annotation effort while maintaining model performance.
However, practitioners face many AL strategies for different tasks and need an empirical basis to choose between them. Surveys categorize AL strategies into taxonomies without performance indications, and presentations of novel AL strategies compare their performance only to a small subset of existing strategies. Our contribution addresses this lack of an empirical basis by introducing a reproducible active learning evaluation (ALE) framework for the comparative evaluation of AL strategies in NLP.
The framework allows the implementation of AL strategies with low effort and a fair data-driven comparison through defining and tracking experiment parameters (e.g., initial dataset size, number of data points per query step, and the budget). ALE helps practitioners to make more informed decisions, and researchers can focus on developing new, effective AL strategies and deriving best practices for specific use cases. With best practices, practitioners can lower their annotation costs. We present a case study to illustrate how to use the framework.
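A budgeted query loop of the kind such a framework tracks might be sketched as follows. The function and parameter names, and the strategy interface, are assumptions for illustration, not ALE's actual API.

```python
import random

def active_learning_loop(pool, initial_size, query_size, budget, propose):
    """Skeleton of a budgeted AL experiment: start from a seed set,
    then repeatedly let a strategy propose the next `query_size`
    points until the annotation budget is exhausted.

    `propose` is the pluggable AL strategy: any callable that
    ranks the remaining unlabeled pool.
    """
    labeled = list(pool[:initial_size])
    unlabeled = list(pool[initial_size:])
    spent = len(labeled)  # each labeled point costs one annotation
    while unlabeled and spent + query_size <= budget:
        picks = propose(unlabeled)[:query_size]
        for p in picks:
            unlabeled.remove(p)
        labeled.extend(picks)
        spent += len(picks)
    return labeled

# Random-sampling baseline as a stand-in strategy (hypothetical)
rng = random.Random(0)
def random_strategy(pool):
    return rng.sample(pool, len(pool))

data = list(range(100))
result = active_learning_loop(data, initial_size=10, query_size=5,
                              budget=30, propose=random_strategy)
```

Fixing the seed set, query size, and budget across runs is what makes the comparison between strategies data-driven and fair.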
The multi-disciplinary scientific program of ANM’09 includes topics in the fields of "Nanotechnology and Microelectronics", ranging from "Bio/Micro/Nano Materials and Interfacing", "Chemical and Bio-Sensors", "Magnetic and Superconducting Devices", and "MEMS and Microfluidics" through "Theoretical Aspects, Methods and Modelling" up to the important bridging session "Academics meet Industry".
An increasing number of applications target their execution at specific hardware such as general-purpose Graphics Processing Units. Some Cloud Computing providers offer this specific hardware so that organizations can rent such resources. However, outsourcing the whole application to the Cloud causes avoidable costs if only some parts of the application benefit from the specific, expensive hardware. Partial execution of applications in the Cloud is a trade-off between costs and efficiency. This paper addresses the demand for a consistent framework that allows a mixture of on- and off-premise calculations by migrating only specific parts to a Cloud. It uses the concept of workflows to show how individual workflow tasks can be migrated to the Cloud while the remaining tasks are executed on-premise.
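The placement decision described above can be sketched as follows. The task model and the hardware flag are invented for illustration; a real framework would also weigh transfer and rental costs.

```python
def place_tasks(tasks):
    """Toy placement for a hybrid workflow: tasks flagged as needing
    special hardware (e.g. GPGPUs) are migrated to the Cloud, while
    all other tasks stay on-premise.
    """
    return {name: ("cloud" if needs_gpu else "on-premise")
            for name, needs_gpu in tasks}

# Hypothetical three-task workflow: only the GPU-bound task moves
workflow = [("preprocess", False), ("train", True), ("report", False)]
plan = place_tasks(workflow)
```

Splitting the workflow this way rents the expensive hardware only for the tasks that benefit from it, which is exactly the cost/efficiency trade-off the paper targets.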