Conference Proceeding
Refine
Year of publication
Institute
- Fachbereich Medizintechnik und Technomathematik (210)
Language
- English (210)
Document Type
- Conference Proceeding (210)
Keywords
- Biosensor (25)
- CAD (7)
- Finite-Elemente-Methode (7)
- civil engineering (7)
- Bauingenieurwesen (6)
- Clusterion (4)
- Limit analysis (4)
- Natural language processing (4)
- Air purification (3)
- Einspielen <Werkstoff> (3)
- Hämoglobin (3)
- Luftreiniger (3)
- Plasmacluster ion technology (3)
- Raumluft (3)
- Shakedown analysis (3)
- Sonde (3)
- shakedown analysis (3)
- Bruchmechanik (2)
- Clustering (2)
- Eisschicht (2)
- Erythrozyt (2)
- FEM (2)
- Information extraction (2)
- Kohlenstofffaser (2)
- Lipopolysaccharide (2)
- Shakedown (2)
- Stickstoffmonoxid (2)
- biosensor (2)
- celldrum technology (2)
- limit analysis (2)
- lipopolysaccharides (2)
- nitric oxide gas (2)
- 3-nitrofluoranthene (1)
- Active learning (1)
- Adsorption (1)
- Agent-based simulation (1)
- Analytischer Zulaessigkeitsnachweis (1)
- Anastomose (1)
- Anastomosis (1)
- Autofluoreszenzverfahren (1)
- BTEX compounds (1)
- Bakterien (1)
- Bio-Sensors (1)
- Biomechanics (1)
- Biomechanik (1)
- Biomedizinische Technik (1)
- Biophoton (1)
- Biosensorik (1)
- Blitzschutz (1)
- CAD (1)
- CO (1)
- Chance constrained programming (1)
- Cloud Computing (1)
- Cloud Service Broker (1)
- Conducting polymer (1)
- Dattel (1)
- Deep learning (1)
- Dekontamination (1)
- ECT (1)
- EEG (1)
- EPN (1)
- Einspiel-Analyse (1)
- Einspielanalyse (1)
- Elastodynamik (1)
- Elektrodynamik (1)
- Endothelzelle (1)
- Energy market design (1)
- Evolution of damage (1)
- Exact Ilyushin yield surface (1)
- Extension fracture (1)
- Extension strain criterion (1)
- Festkörper (1)
- Fibroblast (1)
- Finite element method (1)
- First Order Reliability Method (1)
- First-order reliability method (1)
- Fluorescence (1)
- Force (1)
- GaAs hot electron injector (1)
- Gas sensor (1)
- Grid Computing (1)
- Gunn diode (1)
- Heavy metal detection (1)
- High throughput experimentation (1)
- Hotplate (1)
- Hydrodynamik (1)
- Hydrogel (1)
- Hydrogen sensor (1)
- I3S 2005 (1)
- ISFET (1)
- Impedance Spectroscopy (1)
- Information Extraction (1)
- International Symposium on Sensor Science (1)
- Iterative learning control (1)
- Knee (1)
- Körpertemperatur (1)
- LED chip (1)
- Level sensor (1)
- Lichtstreuungsbasierte Instrumente (1)
- Load modeling (1)
- MEMS (1)
- Machine learning (1)
- Main sensitivity (1)
- Market modeling (1)
- Mechanische Beanspruchung (1)
- Microreactors (1)
- Mohr–Coulomb criterion (1)
- Multi-dimensional wave propagation (1)
- Nano Materials (1)
- Nanomaterial (1)
- Nanoparticles (1)
- Nanopartikel (1)
- Nanostructuring (1)
- Nanotechnologie (1)
- Nanotechnology ; Microelectronics ; Biosensors ; Superconductor ; MEMS (1)
- Natriumhypochlorit (1)
- Natural Language Processing (1)
- Natural language understanding (1)
- Nichtlineare Gleichung (1)
- Nichtlineare Optimierung (1)
- Nichtlineare Welle (1)
- Organophosphorus (1)
- Ostazine Orange (1)
- PFM (1)
- Pflanzenphysiologie (1)
- Pflanzenscanner (1)
- Phenylalanine determination (1)
- Potentiometry (1)
- Process model (1)
- Profile Extraction (1)
- Profile extraction (1)
- Proteine (1)
- Pseudomonas putida (1)
- Quartz crystal nanobalance (QCN) (1)
- Quartz micro balances (1)
- Query learning (1)
- Random variable (1)
- Ratcheting (1)
- Reaction-diffusion (1)
- Relation classification (1)
- Reliability of structures (1)
- Reproducible research (1)
- Sensitivity (1)
- Sepsis (1)
- Sleep EEG (1)
- Solid amalgam electrodes (1)
- Stochastic programming (1)
- Supraleiter (1)
- Technische Mechanik (1)
- Text Mining (1)
- Text mining (1)
- Tin oxide (1)
- Tobacco mosaic virus (1)
- Traglast (1)
- Traglastanalyse (1)
- Training (1)
- Trustworthy artificial intelligence (1)
- UML (1)
- Unified Modeling Language (1)
- Wafer (1)
- Wasserbrücke (1)
- Wasserstoffperoxid (1)
- Wellen (1)
- Workflow (1)
- Workflow Orchestration (1)
- acetoin (1)
- activated nanostructured carbon (1)
- aktivierte nanostrukturierte Kohlenstofffaser (1)
- ammonia gas sensors (1)
- amperometric sensor (1)
- antimony doped tin oxide (1)
- autofluorescence-based detection system (1)
- biopotential electrodes (1)
- capacitive field-effect biosensor (1)
- capillary micro-droplet cell (1)
- carcinogens (1)
- catalytic decomposition (1)
- chemical reduction method (1)
- contractile tension (1)
- cross sensitivity (1)
- cytosolic water diffusion (1)
- date palm tree (1)
- design-by-analysis (1)
- doped metal oxide (1)
- doped silicon (1)
- doping (1)
- electrical capacitance tomography (1)
- electro-migration (1)
- electronic noses ; dendronized polymers ; inverted mesa technology (1)
- enzymatic methods (1)
- enzyme immobilisation (1)
- enzyme immobilization (1)
- fenitrothion (1)
- finite element analysis (1)
- fluidic (1)
- gas sensor (1)
- gas sensor array (1)
- heater metallisation (1)
- hemoglobin (1)
- hemoglobin dynamics (1)
- high-temperature stability (1)
- humidity (1)
- hydrogel (1)
- hydrogen peroxide (1)
- image sensor (1)
- imaging (1)
- impedance spectroscopy (1)
- ion-selective electrodes (1)
- kontraktile Spannung (1)
- lab-on-a-chip (1)
- lab-on-chip (1)
- layer expansion (1)
- lenslet array (1)
- light scattering analysis (1)
- lightning flash (1)
- limit and shakedown analysis (1)
- limit load (1)
- linear kinematic hardening (1)
- lower bound theorem (1)
- magnetic particles (1)
- material shakedown (1)
- matrix method (1)
- mechanical waves (1)
- metal oxide (1)
- microreactor (1)
- microwave generation (1)
- modeling biosensor (1)
- modelling (1)
- modified electrode (1)
- multi-interface measurement (1)
- nanostructured carbonized plant parts (1)
- nanostrukturierte carbonisierte Pflanzenteile (1)
- nitrogen oxides (1)
- nonlinear kinematic hardening (1)
- nonlinear optimization (1)
- nonlinear solids (1)
- nonlinear tensor constitutive equation (1)
- organic PVC membranes (1)
- pH-based biosensing (1)
- pattern-size reduction (1)
- plant scanner (1)
- plasma generated ions (1)
- polymer composites (1)
- porous Pt electrode (1)
- principal component (1)
- probabilistic fracture mechanics (1)
- protein (1)
- quantum charging (1)
- ratchetting (1)
- reliability (1)
- rhAPC (1)
- screen-printing (1)
- second-order reliability method (1)
- self-aligned patterning (1)
- sensing properties (1)
- sensors (1)
- shakedown (1)
- sterilisation (1)
- subsurface ice research (1)
- subsurface probe (1)
- surface modification (1)
- swift heavy ions (1)
- thick-film technology (1)
- thin-film microsensors (1)
- voltammetry (1)
- wafer-level testing (1)
- water bridge phenomenon (1)
The progress in natural language processing (NLP) research over the last years offers novel business opportunities for companies, such as automated user interaction or improved data analysis. Building sophisticated NLP applications requires dealing with modern machine learning (ML) technologies, which hinders enterprises from establishing successful NLP projects. Our experience in applied NLP research projects shows that the continuous integration of research prototypes into production-like environments with quality assurance builds trust in the software and demonstrates its convenience and usefulness with regard to the business goal. We introduce STAMP 4 NLP as an iterative and incremental process model for developing NLP applications. With STAMP 4 NLP, we merge software engineering principles with best practices from data science. Instantiating our process model allows prototypes to be created efficiently by utilizing templates, conventions, and implementations, enabling developers and data scientists to focus on the business goals. Due to our iterative-incremental approach, businesses can deploy an enhanced version of the prototype to their software environment after every iteration, maximizing potential business value and trust early and avoiding the cost of successful yet never deployed experiments.
The workflow of a high throughput screening setup for the rapid identification of new and improved sensor materials is presented. The polyol method was applied to prepare nanoparticulate metal oxides as base materials, which were functionalised by surface doping. Using multi-electrode substrates and high throughput impedance spectroscopy (HT-IS), a wide range of materials could be screened in a short time. Applying HT-IS in the search for new selective gas sensing materials, a NO2-tolerant NO sensing material with reduced sensitivity towards other test gases was identified, based on iridium-doped zinc oxide. Analogous behaviour was observed for iridium-doped indium oxide.
In: Applications of Graph Transformations with Industrial Relevance. Lecture Notes in Computer Science, vol. 3062, 2004, pp. 434-439. DOI: http://dx.doi.org/10.1007/978-3-540-25959-6_33 This paper gives a brief overview of the tools we have developed to support conceptual design in civil engineering. Based on the UPGRADE framework, two applications, one for the knowledge engineer and another for architects, allow domain-specific knowledge to be stored and then used during conceptual design. Consistency analyses check the design against the defined knowledge and inform the architect if rules are violated.
In: Advances in intelligent computing in engineering : proceedings of the 9th International EG-ICE Workshop, Darmstadt, 01-03 August 2002 / Martina Schnellenbach-Held ... (eds.). - Düsseldorf: VDI-Verl., 2002. - Fortschritt-Berichte VDI, Reihe 4, Bauingenieurwesen; 180; pp. 1-35 The paper describes a novel way to support conceptual design in civil engineering. The designer uses semantic tools that guarantee certain internal structures of the design result as well as the fulfillment of various constraints. Two different approaches and corresponding tools are discussed: (a) visually specified tools with automatic code generation to determine a design structure as well as to fix various constraints a design has to obey; these tools are also valuable for design knowledge specialists; (b) extensions of existing CAD tools to provide semantic knowledge to be used by an architect. It is sketched how these different tools can be combined in the future. The main part of the paper discusses the concepts and realization of two prototypes following the two approaches above. In particular, the paper argues that specific graphs and the specification of their structure are useful for both tool realization projects.
ITCE-2003 - 4th Joint Symposium on Information Technology in Civil Engineering, ed. I. Flood, pp. 1-12, ASCE (CD-ROM), Nashville, USA In this paper we discuss graph-based tools to support architects during the conceptual design phase. Conceptual design takes place before constructive design; the concepts used are more abstract. We develop two graph-based approaches: a top-down approach using the graph rewriting system PROGRES, and a more industrially oriented approach in which we extend the CAD system ArchiCAD. In both approaches, knowledge can be defined by a knowledge engineer: in the top-down approach in the domain model graph, in the bottom-up approach in an XML file. The defined knowledge is used to incrementally check the sketch and to inform the architect about violations of the defined knowledge. Our goal is to discover design errors as early as possible and to support the architect in designing buildings with consideration of conceptual knowledge.
In: Proc. of the 11th Intl. Conf. on Computing in Civil and Building Engineering (ICCCBE-XI), ed. Hugues Rivard, Montreal, Canada, pp. 1-12, ASCE (CD-ROM), 2006 Currently, the conceptual design phase is not adequately supported by any CAD tool. Neither support for elaborating conceptual sketches nor an automatic proof of correctness with respect to the effective restrictions is provided by any commercial tool. To enable domain experts to store the common as well as their personal domain knowledge, we develop a visual language for knowledge formalization. In this paper, a major extension to the already existing concepts is introduced. The possibility to define rule dependencies extends the expressiveness of the knowledge definition language and contributes to the usability of our approach.
In: Computer Aided Architectural Design Futures 2005, Part 4, 207-216, DOI: http://dx.doi.org/10.1007/1-4020-3698-1_19 The conceptual design at the beginning of the building construction process is essential for the success of a building project. Even if some CAD tools allow elaborating conceptual sketches, they rather focus on the shape of the building elements than on their functionality. We introduce semantic room objects and room links, by way of example in the CAD tool ArchiCAD. These extensions provide a basis for specifying the organisation and functionality of a building and free architects from being forced to directly produce detailed constructive sketches. Furthermore, we introduce consistency analyses of the conceptual sketch, based on an ontology containing conceptually relevant knowledge specific to one class of buildings.
In: Net-distributed Co-operation : Xth International Conference on Computing in Civil and Building Engineering, Weimar, June 02-04, 2004; proceedings / [ed. by Karl Beuke ...]. - Weimar: Bauhaus-Univ. Weimar 2004. - 1st ed., pp. 1-14, ISBN 3-86068-213-X Summary In our project, we develop new tools for the conceptual design phase. During conceptual design, the coarse functionality and organization of a building are more important than a construction worked out in detail. We identify two roles: first, the knowledge engineer, who is responsible for knowledge definition and maintenance; second, the architect, who elaborates the conceptual design. The tool for the knowledge engineer is based on graph technology; it is specified using PROGRES and the UPGRADE framework. The tools for the architect are integrated into the industrial CAD tool ArchiCAD. Consistency between knowledge and conceptual design is ensured by the constraint checker, another extension to ArchiCAD.
Proc. of the 2005 ASCE Intl. Conf. on Computing in Civil Engineering (ICCC 2005), eds. L. Soibelman and F. Pena-Mora, pp. 1-14, ASCE (CD-ROM), Cancun, Mexico, 2005 Current CAD tools are not able to support the fundamental conceptual design phase, and none of them provides consistency analyses of sketches produced by architects. To give architects greater support in the conceptual design phase, we develop a CAD tool for conceptual design and a knowledge specification tool allowing the definition of conceptually relevant knowledge. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic knowledge model, different types of design rules formalize the knowledge in a graph-based realization. An expressive visual language provides a user-friendly, human-readable representation. Finally, consistency analyses enable conceptual designs to be checked against this defined knowledge. In this paper we concentrate on the knowledge specification part of our project.
An array of 50 MHz quartz microbalances (QMBs) coated with a dendronized polymer was used to detect small amounts of volatile organic compounds (VOCs) in the gas phase. The results were compared to those obtained with the commonly used 10 MHz QMBs. The 50 MHz QMBs proved to be a powerful tool for the detection of VOCs in the gas phase; therefore, they represent a promising alternative to the much more delicate surface acoustic wave devices (SAWs).
In the energy economy, forecasts of different time series are still rudimentary. In this study, a prediction for the German day-ahead spot market is created with Apache Spark and R. It is just one example of many different applications in virtual power plant environments. Other use cases, such as intraday price processes, load processes of machines or electric vehicles, real-time energy loads of photovoltaic systems, and many more time series need to be analysed and predicted.
This work gives a short introduction to the project in which this study is settled and briefly describes the time series methods used in the energy industry for forecasting. Apache Spark, a powerful cluster computing technology, is utilised as the programming framework. Today, single time series can be predicted. The focus of this work is on developing a method for parallel forecasting, i.e. processing multiple time series simultaneously with R and Apache Spark.
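The pattern behind such parallel forecasting is to partition the data by series and run one independent model fit per partition. The abstract gives no code; the following is a minimal sketch of that pattern in Python/PySpark (the study itself uses R on Spark), where the column names and the deliberately naive trend forecaster are hypothetical stand-ins:

```python
import numpy as np
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("parallel-forecast").getOrCreate()

# Long-format input: one row per (series_id, t, y) observation.
df = spark.createDataFrame(
    [("spot_price", t, float(t % 24)) for t in range(96)]
    + [("pv_load", t, float((3 * t) % 24)) for t in range(96)],
    ["series_id", "t", "y"],
)

def forecast(pdf: pd.DataFrame) -> pd.DataFrame:
    # Stand-in model: fit a linear trend and extrapolate 24 steps ahead.
    # In the study this slot would hold an R time-series model instead.
    pdf = pdf.sort_values("t")
    coef = np.polyfit(pdf["t"], pdf["y"], deg=1)
    horizon = np.arange(pdf["t"].max() + 1, pdf["t"].max() + 25)
    return pd.DataFrame({
        "series_id": pdf["series_id"].iloc[0],
        "t": horizon,
        "yhat": np.polyval(coef, horizon),
    })

# Spark executes one forecast() call per series, distributed over the cluster.
result = df.groupBy("series_id").applyInPandas(
    forecast, schema="series_id string, t long, yhat double"
)
result.show()
```

Each group becomes an independent task, so adding series scales out horizontally instead of lengthening a sequential loop.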
This paper reports the first microbial biosensor for rapid and cost-effective determination of the organophosphorus pesticides fenitrothion and EPN. The biosensor consisted of recombinant PNP-degrading/oxidizing bacteria Pseudomonas putida JS444 anchoring and displaying organophosphorus hydrolase (OPH) on its cell surface as the biological sensing element and a dissolved oxygen electrode as the transducer. Surface-expressed OPH catalyzed the hydrolysis of fenitrothion and EPN to release 3-methyl-4-nitrophenol and p-nitrophenol, respectively, which were oxidized by the enzymatic machinery of Pseudomonas putida JS444 to carbon dioxide while consuming oxygen; the oxygen consumption was measured and correlated to the concentration of organophosphates. Under the optimum operating conditions, the biosensor was able to measure as little as 277 ppb of fenitrothion and 1.6 ppm of EPN without interference from phenolic compounds or from commonly used pesticides such as carbamate pesticides, triazine herbicides and organophosphate pesticides without a nitrophenyl substituent. The applicability of the biosensor to lake water was also demonstrated.
Conventional EEG devices cannot be used in everyday life; hence, research in the past decade has focused on ear-EEG for mobile at-home monitoring in applications ranging from emotion detection to sleep monitoring. As the area available for electrode contact in the ear is limited, electrode size and location play a vital role in an ear-EEG system. In this investigation, we present a quantitative study of ear electrodes with two electrode sizes at different locations in wet and dry configurations. Electrode impedance scales inversely with size and ranges from 450 kΩ to 1.29 MΩ for dry and from 22 kΩ to 42 kΩ for wet contact at 10 Hz. For either size, the location in the ear canal with the lowest impedance is ELE (Left Ear Superior), presumably due to increased contact pressure caused by the outer-ear anatomy. The results can be used to optimize signal pickup and SNR for specific applications. We demonstrate this by recording sleep spindles during sleep onset with high quality (5.27 μVrms).
Sorption of toxic-shock LPS by nanoparticles based on carbonized vegetable raw materials
(2008)
Immobilization of lactobacilli on vegetable raw materials carbonized at high temperature (rice husk, grape stones) increases their physiological activity and the quantity of antibacterial metabolites, which consequently increases the antagonistic activity of the lactobacilli. This implies that the use of nanosorbents for the attachment of probiotic microorganisms is highly promising for solving important problems such as delivering probiotic preparations to the right place and attaching them to the intestinal mucosa, with subsequent detoxication of the gastrointestinal tract and normalization of its microecology. Moreover, the carbonized nanoparticles obtained have a peculiar property: the ability to sorb the LPS responsible for toxic shock and hence to detoxify LPS.
Useful market simulations are key to the evaluation of different market designs consisting of multiple market mechanisms or rules. Yet no simulation framework designed with the comparison of different market mechanisms in mind was found. The need to create an objective view on different sets of market rules while investigating meaningful agent strategies leads to the conclusion that such a simulation framework is required to advance research on this subject. An overview of different existing market simulation models is given, which also shows the research gap and the missing capabilities of those systems. Finally, a methodology is outlined for developing a novel market simulation that can answer the research questions.
Market abstraction of energy markets and policies - application in an agent-based modeling toolbox
(2023)
In light of emerging challenges in energy systems, markets are prone to changing dynamics and market design. Simulation models are commonly used to understand the changing dynamics of future electricity markets. However, existing market models were often created with specific use cases in mind, which limits their flexibility and usability. This can impose challenges for using a single model to compare different market designs. This paper introduces a new method of defining market designs for energy market simulations. The proposed concept makes it easy to incorporate different market designs into electricity market models by using relevant parameters derived from analyzing existing simulation tools, morphological categorization and ontologies. These parameters are then used to derive a market abstraction and integrate it into an agent-based simulation framework, allowing for a unified analysis of diverse market designs. Furthermore, we showcase the usability of integrating new types of long-term contracts and over-the-counter trading. To validate this approach, two case studies are demonstrated: a pay-as-clear market and a pay-as-bid long-term market. These examples demonstrate the capabilities of the proposed framework.
Quartz crystal nanobalance (QCN) sensors are considered powerful mass-sensitive sensors for determining materials at the sub-nanogram level. In this study, a single piezoelectric quartz crystal nanobalance modified with polystyrene was employed to detect benzene, toluene, ethylbenzene and xylene (BTEX compounds). The frequency shift of the QCN sensor was found to be linear with the BTEX compound concentrations in the range of about 1-45 mg l-1. The correlation coefficients for benzene, toluene, ethylbenzene, and xylene were 0.991, 0.9977, 0.9946 and 0.9971, respectively. Principal component analysis was also utilized to process the frequency response data of the single piezoelectric crystal at different times, taking into account the different adsorption-desorption dynamics of the BTEX compounds. Using principal component analysis, it was found that over 90% of the data variance could be explained by two principal components (PC1 and PC2). Subsequently, the successful identification of benzene and toluene was possible through principal component analysis of the transient responses of the polystyrene-modified QCN sensor. The results showed that the polystyrene-modified QCN had favorable identification and quantification performance for the BTEX compounds.
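The analysis step is standard PCA on a matrix whose rows are individual exposures and whose columns are frequency shifts sampled over time. A minimal sketch with scikit-learn, using synthetic data in place of the measured QCN transients (shapes and names are illustrative only):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Rows: individual gas exposures; columns: frequency shift at 20 time points.
# Synthetic stand-in for the measured transient responses.
X = rng.normal(size=(40, 20))

pca = PCA(n_components=2)
scores = pca.fit_transform(X)            # PC1/PC2 score per exposure
print(pca.explained_variance_ratio_)     # fraction of variance per component
```

With real transients, analytes with different adsorption-desorption dynamics separate into clusters in the PC1-PC2 plane, which is what enables the identification reported above.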
Detection of Adrenaline Based on a Bioelectrocatalytic System to Support Tumor Diagnostic Technology
(2017)
An H2O2 sensor for application in industrial sterilisation processes has been developed. To this end, automated sterilisation equipment at laboratory scale has been constructed using parts from industrial sterilisation facilities. In addition, a software tool has been developed for the control of the laboratory-scale sterilisation equipment. First measurements with the developed sensor set-up as part of the sterilisation equipment have been performed, and the sensor has been physically characterised by optical microscopy and SEM.
The absence of a general method for endotoxin removal from liquid interfaces presents an opportunity to find new methods and materials to close this gap. Activated nanostructured carbon is a promising material that has shown good adsorption properties due to its vast pore network and high surface area. The aim of this study is to find the adsorption rates for a carbonaceous material produced at different temperatures, as well as to reveal possible differences in the performance of the material for each of the adsorbates used during the study (hemoglobin, serum albumin and lipopolysaccharide, LPS).
In positron emission tomography, improving time, energy and spatial detector resolutions and using Compton kinematics introduce the possibility of reconstructing a radioactivity distribution image from scatter coincidences, thereby enhancing image quality. The number of single-scattered coincidences alone is of the same order of magnitude as that of true coincidences. In this work, a compact Compton camera module based on monolithic scintillation material is investigated as a detector ring module. The detector interactions are simulated with the Monte Carlo package GATE. The scattering angle inside the tissue is derived from the energy of the scattered photon, which results in a set of possible scattering trajectories, a broken line of response. Compton kinematics collimation reduces the number of solutions, and the time-of-flight information additionally helps localize the position of the annihilation. One question of this investigation is how the energy, spatial and temporal resolutions help confine the possible annihilation volume. A comparison of currently technically feasible detector resolutions (under laboratory conditions) demonstrates their influence on this annihilation volume and shows that energy and coincidence time resolution have a significant impact. Improving the latter from 400 ps to 100 ps shrinks the annihilation volume by around 50%, while improving the energy resolution in the absorber layer from 12% to 4.5% results in a reduction of 60%. The inclusion of single tissue-scattered data has the potential to increase the sensitivity of a scanner by a factor of 2 to 3. The concept can be further optimized and extended for multiple scatter coincidences and subsequently validated by a reconstruction algorithm.
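For reference (the abstract does not spell it out), deriving the in-tissue scattering angle from the scattered photon's energy rests on the standard Compton relation; for an annihilation photon the initial energy equals the electron rest energy:

```latex
\cos\theta \;=\; 1 - m_e c^2 \left( \frac{1}{E'} - \frac{1}{E_0} \right),
\qquad E_0 = m_e c^2 = 511\,\text{keV},
```

so a measured scattered energy E' fixes the angle θ and hence the set of possible scattering trajectories (the broken line of response) on which the annihilation must lie.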
In this paper, methods of sample preparation for potentiometric measurement of phenylalanine are presented. Based on the spectrophotometric measurements of phenylalanine, the concentrations of the reagents of the enzymatic reaction (10 mM L-Phe, 0.4 mM NAD+, 2 U L-PheDH) were determined. Then, the absorption spectrum of the reaction product, NADH, was monitored (absorption maximum at 340 nm). The results obtained by the spectrophotometric method were compared with the results obtained by colourimetry using pH indicators. The two above-mentioned methods will be used as references for potentiometric measurements of phenylalanine concentration.
In this paper, methods of surface modification of different supports, i.e. glass and polymeric beads, for enzyme immobilisation are described. The developed method of enzyme immobilisation is based on Schiff's base formation between the amino groups on the enzyme surface and the aldehyde groups on the chemically modified surface of the supports. The surface of silicon modified by APTS and GOPS with immobilised enzyme was characterised by atomic force microscopy (AFM), time-of-flight secondary ion mass spectrometry (ToF-SIMS) and infrared spectroscopy (FTIR). The supports with immobilised enzyme (urease) were also tested in combination with microreactors fabricated in silicon and Perspex, operating in a flow-through system. For microreactors filled with urease immobilised on glass beads (Sigma) and on polymeric beads (PAN), a very high and stable signal (pH change) was obtained. The developed method of urease immobilisation can thus be considered very effective.
A new and simple method for nanostructuring using conventional photolithography and a layer expansion or pattern-size reduction technique is presented, which can further be applied to the fabrication of different nanostructures and nano-devices. The method is based on the conversion of a photolithographically patterned metal layer into a metal-oxide mask with improved pattern-size resolution using thermal oxidation. With this technique, the pattern size can be scaled down to dimensions of a few nanometers. The proposed method is experimentally demonstrated by preparing nanostructures with different configurations and layouts, such as circles, rectangles, trapezoids, and "fluidic-channel"-, "cantilever"- and meander-type structures.
Label-free Electrostatic Detection of DNA Amplification by PCR Using Capacitive Field-effect Devices
(2016)
A capacitive field-effect EIS (electrolyte-insulator-semiconductor) sensor modified with a bilayer of the positively charged weak polyelectrolyte poly(allylamine hydrochloride) (PAH) and single-stranded probe DNA (ssDNA) has been used for label-free electrostatic detection of pathogen-specific DNA amplification via polymerase chain reaction (PCR). The sensor is able to distinguish between positive and negative PCR solutions and to detect the existence of target DNA amplicons in PCR samples; it can thus be used as a tool for quick verification of DNA amplification and a successful PCR process.
Label-free sensing of biomolecules by their intrinsic molecular charge using field-effect devices
(2015)
Functional testing and characterisation of ISFETs on wafer level by means of a micro-droplet cell
(2006)
A wafer-level functionality testing and characterisation system for ISFETs (ion-sensitive field-effect transistors) is realised by integrating a specifically designed capillary electrochemical micro-droplet cell into a commercial wafer prober station. The developed system allows "good" ISFETs to be identified and selected at the earliest stage, avoiding expensive bonding, encapsulation and packaging processes for non-functioning ISFETs and thus decreasing the costs wasted on bad dies. The system is also feasible for wafer-level characterisation of ISFETs in terms of sensitivity, hysteresis and response time. Additionally, it might be utilised for wafer-level testing of further electrochemical sensors.
We compare four different algorithms for automatically estimating the muscle fascicle angle from ultrasonic images: the vesselness filter, the Radon transform, the projection profile method and the gray-level co-occurrence matrix (GLCM). The algorithm results are compared to ground truth data generated by three different experts on 425 image frames from two videos recorded during different types of motion. The best agreement with the ground truth data was achieved by a combination of pre-processing with a vesselness filter and measuring the angle with the projection profile method. The robustness of the estimation is increased by applying the algorithms to subregions with high gradients and performing a LOESS fit through these estimates.
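The projection profile method itself is compact: rotate the (pre-filtered) image over candidate angles and pick the angle at which the row-sum profile is most peaked, i.e. at which the fascicle streaks align with the rows. A minimal sketch under that assumption (search range, step and scoring are illustrative, not the paper's settings):

```python
import numpy as np
from scipy.ndimage import rotate

def projection_profile_angle(img, angles=np.arange(-45.0, 45.0, 0.5)):
    # img: grayscale image in which fascicles appear as bright streaks,
    # e.g. the output of a vesselness filter.
    best_angle, best_score = 0.0, -np.inf
    for a in angles:
        rotated = rotate(img, a, reshape=False, order=1)
        profile = rotated.sum(axis=1)   # row sums = projection profile
        score = profile.var()           # aligned streaks -> peaky profile
        if score > best_score:
            best_angle, best_score = a, score
    return best_angle

# Self-test: horizontal stripes rotated by -10 degrees are recovered at ~+10.
img = np.zeros((128, 128))
img[::8, :] = 1.0
print(projection_profile_angle(rotate(img, -10, reshape=False, order=1)))
```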
A concept for a sensitive micro total analysis system for high throughput fluorescence imaging
(2006)
This paper discusses possible methods of on-chip fluorescence imaging for integrated biosensors. The integration of optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging: it can boost the signal-to-background ratio by a few orders of magnitude in comparison to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis systems (µTAS). The first method relates to side illumination of the fluorescent material placed into microcompartments of the lab-on-chip; its significance lies in the high utilization of excitation energy at low concentrations of fluorescent material. The second method utilizes a transparent µLED chip, which allows the excitation light sources to be placed on the same optical axis as the emission detector, such that the excitation and emission rays travel in opposite directions. The third method presents spatial filtering of the excitation background.
Human induced pluripotent stem cells (hiPSCs) have been shown to be promising in disease studies and drug screenings [1]. Cardiomyocytes derived from hiPSCs have been extensively investigated using patch clamping and optical methods to compare their electromechanical behaviour with that of fully matured adult cells. Mathematical models can be used for translating findings on hiPSC-CMs to adult cells [2] or to better understand the mechanisms of various ion channels when a drug is applied [3,4]. Paci et al. (2013) [3] developed the first model of hiPSC-CMs, which they later refined based on new data. The model is based on iCells® (Fujifilm Cellular Dynamics, Inc. (FCDI), Madison WI, USA), but major differences among several cell lines and even within a single cell line have been found and motivate an approach for creating sample-specific models. We have developed an optimisation algorithm that parameterises the conductances (in S/F, Siemens per Farad) of the latest Paci et al. model (2018) [5] using current-voltage data obtained in individual patch-clamp experiments with an automated patch-clamp system (Patchliner, Nanion Technologies GmbH, Munich).
A solid-state amperometric hydrogen sensor based on a protonated Nafion membrane and a catalytically active electrode operating at room temperature was fabricated and tested. Ionic conducting polymer-metal electrode interfaces were prepared chemically using the impregnation-reduction method: the polymer membrane was impregnated with tetraammine platinum chloride hydrate and the metal ions were subsequently reduced using either sodium tetrahydroborate or potassium tetrahydroborate. The hydrogen sensing characteristics with air as reference gas are reported. The sensors were capable of detecting hydrogen concentrations from 10 ppm to 10% in nitrogen. The response time was in the range of 10-30 s and a stable linear current output was observed. The thin Pt films were characterized by XRD, infrared spectroscopy, optical microscopy, atomic force microscopy, scanning electron microscopy and EDAX.
The integration of product data from heterogeneous sources and manufacturers into a single catalog is often still a laborious, manual task. Especially small and medium-sized enterprises face the challenge of integrating the data their business relies on in a timely fashion to keep their product catalog up to date, owing to format specifications, low data quality and the required expert knowledge. Additionally, modern approaches to simplify catalog integration demand experience in machine learning, word vectorization, or semantic similarity that such enterprises do not have; furthermore, most approaches struggle with low-quality data. We propose Attribute Label Ranking (ALR), an easy-to-understand and simple-to-adapt learning approach. ALR leverages a model trained on real-world integration data to identify the best possible schema mapping of a previously unknown, proprietary, tabular format into a standardized catalog schema. Our approach predicts multiple labels for every attribute of an input column, and the whole column is taken into consideration to rank among these labels. We evaluate ALR regarding the correctness of predictions and compare the results on real-world data to state-of-the-art approaches. Additionally, we report findings from our experiments and the limitations of our approach.
The integration of frequently changing, volatile product data from different manufacturers into a single catalog is a significant challenge for small and medium-sized e-commerce companies. They rely on timely integration of product data to present it aggregated in an online shop, without knowing format specifications, the manufacturers' understanding of concepts, or the data quality; moreover, format, concepts, and data quality may change at any time. Consequently, integrating product catalogs into a single standardized catalog is often a laborious manual task. Current strategies to streamline or automate catalog integration use techniques based on machine learning, word vectorization, or semantic similarity; however, most approaches struggle with low-quality or real-world data. We propose Attribute Label Ranking (ALR) as a recommendation engine that simplifies for practitioners the integration of previously unknown, proprietary tabular formats into a standardized catalog. We evaluate ALR by focusing on the impact of different neural network architectures, language features, and semantic similarity. Additionally, we consider metrics for industrial application and present the impact of ALR in production and its limitations.
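The core idea, predicting label probabilities per cell and ranking them over the whole column, can be shown in a few lines. The sketch below is purely illustrative: ALR itself is trained on real-world integration data with the architectures evaluated in the papers, whereas here a character n-gram classifier, made-up labels and made-up values stand in for all of that.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: cell values with their catalog attribute labels.
train_values = ["4711", "0815", "red", "blue", "12.5 kg", "3.2 kg"]
train_labels = ["article_no", "article_no", "color", "color", "weight", "weight"]

clf = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 3)),
    LogisticRegression(max_iter=1000),
)
clf.fit(train_values, train_labels)

# A whole unknown column is classified cell by cell; the per-cell label
# probabilities are averaged and the candidate labels ranked for the column.
column = ["7.0 kg", "1.25 kg", "900 g"]
mean_proba = clf.predict_proba(column).mean(axis=0)
ranking = sorted(zip(clf.classes_, mean_proba), key=lambda p: -p[1])
print(ranking)   # best-ranked label first, e.g. "weight"
```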
Pulmonary arterial cannulation is a common and effective method of percutaneous mechanical circulatory support for concurrent right heart and respiratory failure [1]. However, limited data exist on the effect the positioning of the cannula has on oxygen perfusion throughout the pulmonary artery (PA). This study aims to evaluate, using computational fluid dynamics (CFD), the effect of different cannula positions in the PA on the oxygenation of the different branching vessels, in order to determine an optimal cannula position. The four chosen cannula positions (see Fig. 1) are: in the lower part of the main pulmonary artery (MPA); in the MPA at the junction between the right pulmonary artery (RPA) and the left pulmonary artery (LPA); in the RPA at its first branch; and in the LPA at its first branch.
This paper presents the NLP Lean Programming framework (NLPf), a new framework for creating custom natural language processing (NLP) models and pipelines by utilizing common software development build systems. This approach allows developers to train and integrate domain-specific NLP pipelines into their applications seamlessly. Additionally, NLPf provides an annotation tool which improves the annotation process significantly by providing a well-designed GUI and a sophisticated way of using input devices. Due to NLPf's properties, developers and domain experts are able to build domain-specific NLP applications more efficiently. NLPf is open-source software and available at https://gitlab.com/schrieveslaach/NLPf.
Research collaborations provide opportunities for both practitioners and researchers: practitioners need solutions for difficult business challenges and researchers are looking for hard problems to solve and publish. Nevertheless, research collaborations carry the risk that practitioners focus on quick solutions too much and that researchers tackle theoretical problems, resulting in products which do not fulfill the project requirements.
In this paper we introduce an approach extending the ideas of agile and lean software development. It helps practitioners and researchers keep track of their common research collaboration goal: a scientifically enriched software product which fulfills the needs of the practitioner’s business model.
This approach gives first-class status to application-oriented metrics that measure progress and success of a research collaboration continuously. Those metrics are derived from the collaboration requirements and help to focus on a commonly defined goal.
An appropriate tool set evaluates and visualizes those metrics with minimal effort and pushes all participants to focus on their tasks. Thus project status, challenges and progress are transparent to all research collaboration members at any time.
Proceedings of the 2nd Humboldt Kolleg, Hammamet, Tunisia. Organizer: Alexander von Humboldt Stiftung, Germany. 184 pp. Welcome Address: Dear Participants, welcome to the 2nd Humboldt Kolleg on "Nanoscale Science and Technology" (NS&T'12) in Tunisia, sponsored by the Alexander von Humboldt foundation. The NS&T'12 multidisciplinary scientific program includes seven "hot" topics in nanoscale science and technology, covering basic and application-oriented research as well as industrial (market) aspects: - Molecular Biophysics, Spectroscopy Techniques, Imaging Microscopy - Nanomaterials Synthesis for Medicine and Bio-chemical Sensors - Nanostructures, Semiconductors, Photonics and Nanodevices - New Technologies in Market Industry - Environment, Electro-chemistry, Bio-polymers and Fuel Cells - Nanomaterials, Photovoltaics, Modelling, Quantum Physics - Microelectronics, Sensors Networks and Embedded Systems We are deeply indebted to all members of the Scientific Committee and the General Chairs for the joint sessions, and to all speakers and chairmen, who have dedicated invaluable time and effort to the realization of this event. On behalf of the Organizing Committee, we cordially invite you to join the conference and hope that your stay will be fruitful, rewarding and enjoyable. Prof. Dr. Michael J. Schöning, Prof. Dr. Adnane Abdelghani
The ANM'09 multi-disciplinary scientific program includes topics in the fields of nanotechnology and microelectronics, ranging from "Bio/Micro/Nano Materials and Interfacing" aspects, "Chemical and Bio-Sensors", "Magnetic and Superconducting Devices" and "MEMS and Microfluidics" through "Theoretical Aspects, Methods and Modelling" to the important bridging topic "Academics meet Industry".
In collaborative research projects, both researchers and practitioners work together solving business-critical challenges. These projects often deal with ETL processes, in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are not deterministic, their output quality may decrease over time. This fact leads to an overall quality loss of the application that embeds the machine learning models; hence, the software quality in development and in production may differ.
Machine learning models are black boxes. That makes practitioners skeptical and raises the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production offers an early response capability to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that they integrate possible new inputs into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with an automated software quality monitoring in productive use and a possibility to auto-generate new test data coming from processed documents in production.
Through automated monitoring of the software quality and auto-generated test data, this approach ensures that the software quality meets and keeps requested thresholds in productive use, even during further continuous deployment and changing input data.
Magnetic nanoparticles (MNP) are investigated with great interest for biomedical applications in diagnostics (e.g. imaging: magnetic particle imaging (MPI)), therapeutics (e.g. hyperthermia: magnetic fluid hyperthermia (MFH)) and multi-purpose biosensing (e.g. magnetic immunoassays (MIA)). What all of these applications have in common is that they are based on the unique magnetic relaxation mechanisms of MNP in an alternating magnetic field (AMF). While MFH and MPI are currently the most prominent examples of biomedical applications, here we present results on the relatively new biosensing application of frequency mixing magnetic detection (FMMD) from a simulation perspective. In general, we ask how the key parameters of MNP (core size and magnetic anisotropy) affect the FMMD signal: by varying the core size, we investigate the effect of the magnetic volume per MNP; and by changing the effective magnetic anisotropy, we study the MNPs' flexibility to leave their preferred magnetization direction. From this, we predict the most effective combination of MNP core size and magnetic anisotropy for maximum signal generation.
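The measurement principle behind FMMD is that the nonlinear magnetization curve of the MNP mixes two drive frequencies, producing intermodulation components such as f_hi + 2·f_lo. A minimal numerical sketch under an ideal, anisotropy-free Langevin response (drive amplitudes, frequencies and sampling are hypothetical, not the simulation parameters of the study):

```python
import numpy as np

fs, T = 200_000.0, 1.0               # sample rate [Hz] and duration [s]
f_lo, f_hi = 62.0, 40_000.0          # two drive frequencies [Hz]
t = np.arange(0, T, 1 / fs)
# Dimensionless field xi: strong low-frequency plus weak high-frequency drive
xi = 2.0 * np.cos(2 * np.pi * f_lo * t) + 0.5 * np.cos(2 * np.pi * f_hi * t)

def langevin(x):
    # Equilibrium MNP magnetization L(x) = coth(x) - 1/x,
    # with the series x/3 near zero to stay numerically safe.
    small = np.abs(x) < 1e-6
    safe = np.where(small, 1.0, x)
    return np.where(small, x / 3.0, 1.0 / np.tanh(safe) - 1.0 / safe)

m = langevin(xi)
spec = np.abs(np.fft.rfft(m)) / len(m)
freqs = np.fft.rfftfreq(len(m), 1 / fs)

# The odd nonlinearity generates the mixing component at f_hi + 2*f_lo,
# which is the frequency FMMD demodulates.
idx = np.argmin(np.abs(freqs - (f_hi + 2 * f_lo)))
print(f"{freqs[idx]:.0f} Hz -> amplitude {spec[idx]:.3e}")
```

Larger cores steepen the Langevin curve (stronger nonlinearity), which is one intuition for why core size and anisotropy govern the FMMD signal strength studied above.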
Micromachined thermal heater platforms offer low electrical power consumption and high modulation speed, i.e. properties which are advantageous for realizing nondispersive infrared (NDIR) gas- and liquid monitoring systems. In this paper, we report on investigations on silicon-on-insulator (SOI) based infrared (IR) emitter devices heated by employing different kinds of metallic and semiconductor heater materials. Our results clearly reveal the superior high-temperature performance of semiconductor over metallic heater materials. Long-term stable emitter operation in the vicinity of 1300 K could be attained using heavily antimony-doped tin dioxide (SnO2:Sb) heater elements.
Study of swift heavy ion modified conducting polymer composites for application as gas sensors
(2006)
A polyaniline-based conducting composite was prepared by oxidative polymerisation of aniline in a polyvinylchloride (PVC) matrix. Coherent free-standing thin films of the composite were prepared by a solution casting method. The polyvinyl chloride-polyaniline composites exposed to 120 MeV silicon ions with total ion fluences ranging from 10^11 to 10^13 ions/cm^2 were observed to be more sensitive towards ammonia gas than the unirradiated composite. The response time of the irradiated composites was observed to be comparably shorter. We report for the first time the application of swift heavy ion modified insulating polymer-conducting polymer (IPCP) composites for sensing ammonia gas.
This paper presents the direct route to Design by Analysis (DBA) of the new European pressure vessel standard in the language of limit and shakedown analysis (LISA). This approach leads to an optimization problem. Its solution with finite element analysis is demonstrated for some examples from the DBA manual. One observation from the examples is that the optimisation approach gives reliable and close lower-bound solutions, leading to simple and optimised design decisions.
In: Technical feasibility and reliability of passive safety systems for nuclear power plants. Proceedings of an Advisory Group Meeting held in Jülich, 21-24 November 1994. - Vienna, 1996. - pp. 43-55, IAEA-TECDOC-920 Abstract: It is shown that the difficulty for probabilistic fracture mechanics (PFM) is the general problem of the high reliability of a small population, and there is no way around this problem as yet. Therefore, what PFM can contribute to the reliability of steel pressure boundaries is demonstrated with the example of a typical reactor pressure vessel and critically discussed. Although no method is distinguishable that could give exact failure probabilities, PFM offers several additional opportunities: upper limits for the failure probability may be obtained together with trends for design and operating conditions, and PFM can identify the most sensitive parameters, improved control of which would increase reliability. Thus PFM should play a vital role in the analysis of steel pressure boundaries despite all shortcomings.
The nonlinear scalar constitutive equations of gases lead to a change in sound speed from point to point, as would be found in linear inhomogeneous (and time-dependent) media. The nonlinear tensor constitutive equations of solids introduce the additional local effect of solution-dependent anisotropy: the speed of a wave passing through a point changes with propagation direction, and its rays are inclined to the front. It is an open question whether the widely used operator splitting techniques achieve a dimensional splitting with physically reasonable results for these multi-dimensional problems. Maybe this is the main reason why the theoretical and numerical investigation of multi-dimensional wave propagation in nonlinear solids is so far behind gas dynamics. We hope to promote the subject a little by discussing some fundamental aspects of the solution of the equations of nonlinear elastodynamics. We use methods of characteristics because they integrate only mathematically exact equations which have a direct physical interpretation.
Smoothed Finite Element Methods for Nonlinear Solid Mechanics Problems: 2D and 3D Case Studies
(2016)
The Smoothed Finite Element Method (SFEM) is presented as an edge-based and a face-based technique for 2D and 3D boundary value problems, respectively. SFEMs avoid shortcomings of the standard Finite Element Method (FEM) with lower-order elements, such as overly stiff behavior, poor stress solutions, and locking effects. Based on the idea of spatially averaging the standard FEM strain field over so-called smoothing domains, the SFEM calculates the stiffness matrix for the same number of degrees of freedom (DOFs) as the FEM. However, the SFEMs significantly improve accuracy and convergence even for distorted meshes and/or nearly incompressible materials.
Numerical results of the SFEMs for a cardiac tissue membrane (thin plate inflation) and an artery (tension of a 3D tube) clearly show their advantageous properties: improved accuracy, particularly for distorted meshes, and avoidance of shear locking effects.
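The strain smoothing mentioned above is, in the usual SFEM notation (written here as a generic sketch, not the papers' exact discretization), a piecewise-constant averaging of the compatible strain field over each smoothing domain Ω_k of area or volume A_k:

```latex
\tilde{\varepsilon}_k
= \int_{\Omega_k} \varepsilon(\mathbf{x})\,\Phi_k(\mathbf{x})\,\mathrm{d}\Omega
= \frac{1}{A_k}\int_{\Omega_k} \varepsilon(\mathbf{x})\,\mathrm{d}\Omega,
\qquad
\Phi_k(\mathbf{x}) =
\begin{cases}
1/A_k, & \mathbf{x} \in \Omega_k,\\
0, & \text{otherwise},
\end{cases}
```

so the stiffness matrix is assembled from one constant smoothed strain per domain (built around edges in 2D, faces in 3D), which is what softens the overly stiff standard FEM response.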
Limit and shakedown theorems are exact theories of classical plasticity for the direct computation of safety factors or of the load-carrying capacity under constant and varying loads. Simple versions of limit and shakedown analysis are the basis of all design codes for pressure vessels and piping. Using finite element methods, more realistic modeling can be employed for a more rational design, and the methods can be extended to yield optimum plastic design. In this paper we present a first FE implementation of limit and shakedown analyses for perfectly plastic material. Limit and shakedown analyses of a pipe junction are performed and an interaction diagram is calculated. The results are in good correspondence with the analytic solution we give in the appendix.
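For orientation, the lower-bound (static) shakedown problem solved in such FE implementations is, in the usual notation of Melan's theorem (a generic sketch, not the paper's exact formulation):

```latex
\alpha_{SD} \;=\; \max_{\alpha,\ \bar{\rho}} \; \alpha
\quad \text{s.t.} \quad
f\!\left(\alpha\,\sigma^{E}(\mathbf{x},t) + \bar{\rho}(\mathbf{x})\right) \le 0
\;\; \forall\,\mathbf{x},t,
\qquad \bar{\rho} \ \text{self-equilibrated},
```

where σ^E(x, t) is the purely elastic stress response to the load history, ρ̄ a time-independent residual stress field, f the yield function of the perfectly plastic material, and α_SD the shakedown safety factor. After FE discretization this becomes the nonlinear optimization problem referred to in these abstracts; limit analysis is the special case of a single constant load.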
Safety and reliability of structures may be assessed indirectly by stress distributions. Limit and shakedown theorems are simplified but exact methods of plasticity that provide safety factors directly in the loading space. These theorems may be used for a direct definition of the limit state function for failure by plastic collapse or by inadaptation. In a FEM formulation the limit state function is obtained from a nonlinear optimization problem. This direct approach reduces considerably the necessary knowledge of uncertain technological input data, the computing time, and the numerical error. Moreover, the direct way leads to highly effective and precise reliability analyses. The theorems are implemented into a general purpose FEM program in a way capable of large-scale analysis.
Structural design analyses are conducted with the aim of verifying the exclusion of ratcheting. To this end it is important to make a clear distinction between the shakedown range and the ratcheting range. In cyclic plasticity more sophisticated hardening models have been suggested in order to model the strain evolution observed in ratcheting experiments. The hardening models used in shakedown analysis are comparatively simple. It is shown that shakedown analysis can make quite stable predictions of admissible load ranges despite the simplicity of the underlying hardening models. A linear and a nonlinear kinematic hardening model of two-surface plasticity are compared in material shakedown analysis. Both give identical or similar shakedown ranges. Structural shakedown analyses show that the loading may have a more pronounced effect than the hardening model.
When confining pressure is low or absent, extensional fractures are typical, with fractures occurring on unloaded planes in rock. These "paradox" fractures can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock was developed, but it makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting failure modes which are unexpected in the usual understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD) and as a failure surface at peak (P). Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass than the simple extension strain criterion.
7th International Conference on Reliability of Materials and Structures (RELMAS 2008), June 17-20, 2008, Saint Petersburg, Russia, pp. 354-358. Reprint with corrections in red. Introduction: The analysis of advanced structures working under extremely heavy loading, such as nuclear power plants and piping systems, should take into account the randomness of loading, geometrical and material parameters. The existing reliability analyses are restricted mostly to the elastic working regime, e.g. allowable local stresses. The development of limit and shakedown reliability-based analysis and design methods, exploiting the potential of the shakedown working regime, is highly needed. In this paper the application of a new algorithm of probabilistic limit and shakedown analysis for shell structures is presented, in which the loading and the strength of the material as well as the thickness of the shell are considered as random variables. The reliability analysis problems may be efficiently solved by using a system combining the available FE codes, a deterministic limit and shakedown analysis, and the First and Second Order Reliability Methods (FORM/SORM). Nonlinear sensitivity analyses are obtained directly from the solution of the deterministic problem without extra computational cost.
The interest in PET detectors with monolithic block scintillators is growing. In order to obtain high spatial resolutions dedicated positioning algorithms are required. But even an ideal algorithm can only deliver information which is provided by the detector. In this simulation study we investigated the light distribution on one surface of cuboid LSO scintillators of different size. Scintillators with a large aspect ratio (small footprint and large height) showed significant position information only for a minimum interaction depth of the gamma particle. The results allow a quantitative estimate for a useful aspect ratio.
Design and implementation aspects of a 3D reconstruction algorithm for the Jülich TierPET system
(1997)
We propose a stochastic programming method to analyse the limit and shakedown of structures under random strength with lognormal distribution. In this investigation, a dual chance constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit or the shakedown limit. The edge-based smoothed finite element method (ES-FEM) with three-node linear triangular elements is used.
A new formulation to calculate the shakedown limit load of Kirchhoff plates under stochastic conditions of strength is developed. Direct structural reliability design by chance constrained programming is based on prescribed failure probabilities; this is an effective approach of stochastic programming if it can be formulated as an equivalent deterministic optimization problem. We restrict uncertainty to the strength; the loading is still deterministic. A new formulation is derived for the case of random strength with lognormal distribution. Upper-bound and lower-bound shakedown load factors are calculated simultaneously by a dual algorithm.
Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics that plays a significant role in mechanical and civil engineering design. The concept of direct methods is to determine the ultimate load-bearing capacity of structures beyond the elastic range. For practical problems, the direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, the problem of shakedown analysis becomes a stochastic programming problem. This paper presents chance constrained programming, an effective method of stochastic programming, to solve the shakedown analysis problem under random conditions of strength. In our investigation, the loading is deterministic and the strength is distributed as a normal or lognormal variable.
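The step that makes chance constrained programming tractable is the deterministic equivalent of the probabilistic constraint. For a single lognormal strength σ_y this is the textbook transformation (shown here as a sketch of the idea, not the papers' full multi-variable formulation): requiring that the random strength exceed a deterministic demand d with probability at least 1 − ε gives

```latex
P\{\sigma_y \ge d\} \ge 1-\varepsilon
\;\Longleftrightarrow\;
\frac{\ln d - \mu}{s} \le \Phi^{-1}(\varepsilon)
\;\Longleftrightarrow\;
d \le \exp\!\left(\mu + s\,\Phi^{-1}(\varepsilon)\right),
```

where ln σ_y ~ N(μ, s²) and Φ⁻¹ is the standard normal quantile. The chance constraint thus collapses to an ordinary deterministic bound, and the shakedown problem keeps its convex optimization structure.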
Summary: This paper presents a methodology to study and understand the mechanics of stapled anastomoses by combining empirical experimentation and finite element analysis. The performance of a stapled anastomosis is studied in terms of leakage, and the numerical results are compared to in vitro experiments performed on fresh porcine tissue. The results suggest that leaks occur between the tissue and the staple legs penetrating the tissue.