Conference Proceeding
Year of publication
- 2024 (5)
- 2023 (4)
- 2022 (8)
- 2021 (7)
- 2020 (4)
- 2019 (10)
- 2018 (7)
- 2017 (16)
- 2016 (13)
- 2015 (14)
- 2014 (14)
- 2013 (6)
- 2012 (18)
- 2011 (3)
- 2010 (5)
- 2009 (7)
- 2008 (4)
- 2007 (4)
- 2006 (30)
- 2005 (2)
- 2004 (4)
- 2003 (9)
- 2002 (2)
- 2001 (1)
- 2000 (2)
- 1998 (1)
- 1997 (2)
- 1996 (1)
- 1995 (1)
- 1989 (1)
- 1985 (1)
- 1981 (1)
- 1980 (1)
- 1978 (2)
- 1975 (1)
- 1973 (1)
Institute
- Fachbereich Medizintechnik und Technomathematik (212)
Language
- English (212)
Document Type
- Conference Proceeding (212)
Keywords
- Biosensor (25)
- CAD (7)
- Finite-Elemente-Methode (7)
- civil engineering (7)
- Bauingenieurwesen (6)
- Clusterion (4)
- Limit analysis (4)
- Natural language processing (4)
- Air purification (3)
- Einspielen <Werkstoff> (3)
This paper presents NLP Lean Programming framework (NLPf), a new framework for creating custom natural language processing (NLP) models and pipelines by utilizing common software development build systems. This approach allows developers to train and integrate domain-specific NLP pipelines into their applications seamlessly. Additionally, NLPf provides an annotation tool which improves the annotation process significantly by providing a well-designed GUI and a sophisticated way of using input devices. Due to NLPf's properties, developers and domain experts are able to build domain-specific NLP applications more efficiently. NLPf is open-source software and available at https://gitlab.com/schrieveslaach/NLPf.
Research collaborations provide opportunities for both practitioners and researchers: practitioners need solutions for difficult business challenges, and researchers are looking for hard problems to solve and publish. Nevertheless, research collaborations carry the risk that practitioners focus too much on quick solutions and that researchers tackle only theoretical problems, resulting in products which do not fulfill the project requirements.
In this paper we introduce an approach extending the ideas of agile and lean software development. It helps practitioners and researchers keep track of their common research collaboration goal: a scientifically enriched software product which fulfills the needs of the practitioner’s business model.
This approach gives first-class status to application-oriented metrics that measure progress and success of a research collaboration continuously. Those metrics are derived from the collaboration requirements and help to focus on a commonly defined goal.
An appropriate tool set evaluates and visualizes those metrics with minimal effort, nudging all participants to focus on their tasks. Thus project status, challenges and progress are transparent to all research collaboration members at any time.
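As an illustration, the continuous evaluation of such application-oriented metrics can be sketched as follows; the metric names, targets and the status function are hypothetical examples, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """An application-oriented metric derived from a collaboration requirement."""
    name: str
    target: float   # value the requirement demands
    current: float  # latest measured value

    @property
    def fulfilled(self) -> bool:
        return self.current >= self.target

def project_status(metrics):
    """Summarize collaboration progress as the share of fulfilled metrics."""
    return sum(m.fulfilled for m in metrics) / len(metrics)

# hypothetical metrics for a research collaboration
metrics = [
    Metric("requirement coverage", target=0.90, current=0.95),
    Metric("NLP model F1 score", target=0.80, current=0.72),
]
status = project_status(metrics)  # 0.5 -> half the common goals are met
```

Re-evaluating such metrics on every build keeps the project status transparent without extra reporting effort.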
Pulmonary arterial cannulation is a common and effective method for percutaneous mechanical circulatory support in concurrent right heart and respiratory failure [1]. However, limited data exist on the effect that cannula positioning has on oxygen perfusion throughout the pulmonary artery (PA). This study uses computational fluid dynamics (CFD) to evaluate the effect of different cannula positions in the PA on the oxygenation of the branching vessels, in order to determine an optimal cannula position. The four chosen cannula positions (see Fig. 1) are: in the lower part of the main pulmonary artery (MPA); in the MPA at the junction between the right pulmonary artery (RPA) and the left pulmonary artery (LPA); in the RPA at its first branch; and in the LPA at its first branch.
The integration of product data from heterogeneous sources and manufacturers into a single catalog is often still a laborious, manual task. Small and medium-sized enterprises in particular face the challenge of integrating the data their business relies on in a timely manner to keep their product catalog up to date, owing to format specifications, low data quality and the required expert knowledge. Additionally, modern approaches to simplifying catalog integration demand experience in machine learning, word vectorization, or semantic similarity that such enterprises do not have. Furthermore, most approaches struggle with low-quality data. We propose Attribute Label Ranking (ALR), an easy-to-understand and simple-to-adapt learning approach. ALR leverages a model trained on real-world integration data to identify the best possible schema mapping of a previously unknown, proprietary tabular format into a standardized catalog schema. Our approach predicts multiple labels for every attribute of an input column, and the whole column is taken into consideration to rank among these labels. We evaluate ALR regarding the correctness of predictions and compare the results on real-world data to state-of-the-art approaches. Additionally, we report findings from our experiments and the limitations of our approach.
The integration of frequently changing, volatile product data from different manufacturers into a single catalog is a significant challenge for small and medium-sized e-commerce companies. They rely on the timely integration of product data to present it in aggregated form in an online shop, without knowing format specifications, the manufacturers' understanding of concepts, or the data quality. Furthermore, format, concepts, and data quality may change at any time. Consequently, integrating product catalogs into a single standardized catalog is often a laborious manual task. Current strategies to streamline or automate catalog integration use techniques based on machine learning, word vectorization, or semantic similarity. However, most approaches struggle with low-quality or real-world data. We propose Attribute Label Ranking (ALR) as a recommendation engine that simplifies for practitioners the integration of a previously unknown, proprietary tabular format into a standardized catalog. We evaluate ALR by focusing on the impact of different neural network architectures, language features, and semantic similarity. Additionally, we consider metrics for industrial application and present the impact of ALR in production as well as its limitations.
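The column-level ranking idea — predict labels per cell, then pool over the whole column — can be sketched as follows. The per-cell model is a toy stand-in (the real ALR model is trained on integration data), and the labels are hypothetical:

```python
from collections import Counter

def rank_labels_for_column(cells, predict_labels):
    """Rank candidate schema labels for one input column.

    `predict_labels` stands in for a trained per-cell model: it maps a
    single cell value to a list of plausible labels, best first. The whole
    column is taken into account by pooling discounted per-cell votes.
    """
    votes = Counter()
    for cell in cells:
        for rank, label in enumerate(predict_labels(cell)):
            votes[label] += 1.0 / (rank + 1)  # vote weight decays with rank
    return [label for label, _ in votes.most_common()]

# toy stand-in model: numeric-looking cells suggest "price" (then "weight"),
# everything else suggests "name"
def toy_model(cell):
    return ["price", "weight"] if cell.replace(".", "").isdigit() else ["name"]

ranking = rank_labels_for_column(["9.99", "12.50", "foo"], toy_model)
# "price" outranks the alternatives because most cells vote for it
```

Pooling over the column makes the mapping robust against individual noisy cells, which is the point of ranking at column rather than cell level.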
A solid-state amperometric hydrogen sensor based on a protonated Nafion membrane and a catalytically active electrode operating at room temperature was fabricated and tested. Ionic conducting polymer-metal electrode interfaces were prepared chemically by using the impregnation-reduction method. The polymer membrane was impregnated with tetra-ammine platinum chloride hydrate and the metal ions were subsequently reduced by using either sodium tetrahydroborate or potassium tetrahydroborate. The hydrogen-sensing characteristics with air as reference gas are reported. The sensors were capable of detecting hydrogen concentrations from 10 ppm to 10% in nitrogen. The response time was in the range of 10-30 s and a stable linear current output was observed. The thin Pt films were characterized by XRD, Infrared Spectroscopy, Optical Microscopy, Atomic Force Microscopy, Scanning Electron Microscopy and EDAX.
Human induced pluripotent stem cells (hiPSCs) have been shown to be promising in disease studies and drug screening [1]. Cardiomyocytes derived from hiPSCs have been extensively investigated using patch clamping and optical methods to compare their electromechanical behaviour with that of fully matured adult cells. Mathematical models can be used for translating findings on hiPSC-CMs to adult cells [2] or to better understand the mechanisms of various ion channels when a drug is applied [3,4]. Paci et al. (2013) [3] developed the first model of hiPSC-CMs, which they later refined based on new data [3]. The model is based on iCells® (Fujifilm Cellular Dynamics, Inc. (FCDI), Madison WI, USA), but major differences among several cell lines and even within a single cell line have been found, motivating an approach for creating sample-specific models. We have developed an optimisation algorithm that parameterises the conductances (in S/F = Siemens per Farad) of the latest Paci et al. model (2018) [5] using current-voltage data obtained in individual patch-clamp experiments with an automated patch-clamp system (Patchliner, Nanion Technologies GmbH, Munich).
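Because the whole-cell current is, for fixed gating, linear in the channel conductances, the parameterisation step can be sketched as a least-squares fit to current-voltage data. The driving terms and reversal potentials below are simplified assumptions for illustration, not the Paci et al. model equations:

```python
import numpy as np

def driving_terms(v):
    """Hypothetical open-channel driving forces for two channels.

    In the full model these would come from the gating equations; here
    simple ohmic driving forces with assumed reversal potentials are used.
    """
    e_k, e_na = -85.0, 70.0  # assumed reversal potentials (mV)
    return np.column_stack([v - e_k, v - e_na])

v = np.array([-80.0, -40.0, 0.0, 40.0])   # clamp voltages (mV)
g_true = np.array([0.5, 0.2])              # "true" conductances (S/F)
i_meas = driving_terms(v) @ g_true         # synthetic measured currents

# parameterise the conductances: least-squares fit of g to the I-V data
g_fit, *_ = np.linalg.lstsq(driving_terms(v), i_meas, rcond=None)
```

With noisy experimental traces the same fit recovers per-sample conductances, which is the basis for a sample-specific model.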
A concept for a sensitive micro total analysis system for high throughput fluorescence imaging
(2006)
This paper discusses possible methods of on-chip fluorescence imaging for integrated biosensors. The integration of optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging: it can boost the signal-to-background ratio by a few orders of magnitude in comparison to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis systems (µTAS). The first method relates to side illumination of the fluorescent material placed into microcompartments of the lab-on-chip. Its significance lies in the high utilization of excitation energy at low concentrations of fluorescent material. The utilization of a transparent µLED chip in the second method allows the placement of the excitation light source on the same optical axis as the emission detector, such that the excitation and emission rays propagate in opposite directions. The third method presents spatial filtering of the excitation background.
We compare four different algorithms for automatically estimating the muscle fascicle angle from ultrasonic images: the vesselness filter, the Radon transform, the projection profile method and the gray-level co-occurrence matrix (GLCM). The algorithm results are compared to ground-truth data generated by three different experts on 425 image frames from two videos recorded during different types of motion. The best agreement with the ground-truth data was achieved by a combination of pre-processing with a vesselness filter and measuring the angle with the projection profile method. The robustness of the estimation is increased by applying the algorithms to subregions with high gradients and performing a LOESS fit through these estimates.
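The projection profile idea can be illustrated as follows: bright pixels are projected at a set of candidate angles, and the angle whose projection profile is most sharply peaked is selected. This is a simplified, self-contained sketch, not the authors' implementation:

```python
import numpy as np

def projection_profile_angle(points, candidates_deg):
    """Estimate the dominant line angle from bright-pixel coordinates.

    `points` is an (N, 2) array of (x, y) coordinates. For each candidate
    angle the pixels are projected onto the axis perpendicular to it; when
    the projection direction matches the fascicle direction, whole lines
    collapse onto single profile bins, maximising the profile variance.
    """
    best_angle, best_score = None, -np.inf
    for deg in candidates_deg:
        t = np.deg2rad(deg)
        # signed distance of each pixel from a line with direction `deg`
        proj = points[:, 0] * np.sin(t) - points[:, 1] * np.cos(t)
        hist, _ = np.histogram(proj, bins=64)
        score = hist.var()
        if score > best_score:
            best_angle, best_score = deg, score
    return best_angle

# synthetic "fascicles": three parallel lines at 30 degrees
xs = np.linspace(0, 100, 400)
pts = np.concatenate([
    np.column_stack([xs, np.tan(np.deg2rad(30)) * xs + off])
    for off in (0, 20, 40)
])
angle = projection_profile_angle(pts, range(0, 90))  # -> 30
```

Running the same estimator on high-gradient subregions and smoothing the per-region angles (e.g. with a LOESS fit) corresponds to the robustness step described above.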
Functional testing and characterisation of ISFETs on wafer level by means of a micro-droplet cell
(2006)
A wafer-level functionality-testing and characterisation system for ISFETs (ion-sensitive field-effect transistors) is realised by integrating a specifically designed capillary electrochemical micro-droplet cell into a commercial wafer prober station. The developed system allows the identification and selection of “good” ISFETs at the earliest stage, avoiding expensive bonding, encapsulation and packaging of non-functioning ISFETs and thus decreasing the costs wasted on bad dies. The developed system is also feasible for wafer-level characterisation of ISFETs in terms of sensitivity, hysteresis and response time. Additionally, the system might also be utilised for wafer-level testing of further electrochemical sensors.
Label-free sensing of biomolecules by their intrinsic molecular charge using field-effect devices
(2015)
Label-free Electrostatic Detection of DNA Amplification by PCR Using Capacitive Field-effect Devices
(2016)
A capacitive field-effect EIS (electrolyte-insulator-semiconductor) sensor modified with a bilayer of a positively charged weak polyelectrolyte, poly(allylamine hydrochloride) (PAH), and single-stranded probe DNA (ssDNA) has been used for the label-free electrostatic detection of pathogen-specific DNA amplification via polymerase chain reaction (PCR). The sensor is able to distinguish between positive and negative PCR solutions and to detect the existence of target DNA amplicons in PCR samples; it can thus be used as a tool for a quick verification of DNA amplification and a successful PCR process.
A new and simple method for nanostructuring using conventional photolithography and layer expansion or pattern-size reduction technique is presented, which can further be applied for the fabrication of different nanostructures and nano-devices. The method is based on the conversion of a photolithographically patterned metal layer to a metal-oxide mask with improved pattern-size resolution using thermal oxidation. With this technique, the pattern size can be scaled down to several nanometer dimensions. The proposed method is experimentally demonstrated by preparing nanostructures with different configurations and layouts, like circles, rectangles, trapezoids, “fluidic-channel”-, “cantilever”- and meander-type structures.
In this paper, methods of surface modification of different supports, i.e. glass and polymeric beads, for enzyme immobilisation are described. The developed method of enzyme immobilisation is based on Schiff’s base formation between the amino groups on the enzyme surface and the aldehyde groups on the chemically modified surface of the supports. The silicon surface modified by APTS and GOPS with immobilised enzyme was characterised by atomic force microscopy (AFM), time-of-flight secondary ion mass spectrometry (ToF-SIMS) and infrared spectroscopy (FTIR). The supports with immobilised enzyme (urease) were also tested in combination with microreactors fabricated in silicon and Perspex, operating in a flow-through system. For microreactors filled with urease immobilised on glass beads (Sigma) and on polymeric beads (PAN), a very high and stable signal (pH change) was obtained. The developed method of urease immobilisation thus proved to be very effective.
In this paper, methods of sample preparation for the potentiometric measurement of phenylalanine are presented. Based on the spectrophotometric measurements of phenylalanine, the concentrations of the reagents of the enzymatic reaction (10 mM L-Phe, 0.4 mM NAD+, 2 U L-PheDH) were determined. Then, the absorption spectrum of the reaction product, NADH, was monitored (maximum peak at 340 nm). The results obtained by the spectrophotometric method were compared with those obtained by colourimetry using pH indicators. The two above-mentioned methods will be used as references for potentiometric measurements of phenylalanine concentration.
In positron emission tomography, improving the time, energy and spatial resolutions of detectors and using Compton kinematics introduces the possibility of reconstructing a radioactivity distribution image from scatter coincidences, thereby enhancing image quality. The number of single-scattered coincidences alone is of the same order of magnitude as that of true coincidences. In this work, a compact Compton camera module based on monolithic scintillation material is investigated as a detector-ring module. The detector interactions are simulated with the Monte Carlo package GATE. The scattering angle inside the tissue is derived from the energy of the scattered photon, which results in a set of possible scattering trajectories, or a broken line of response. Compton kinematics collimation reduces the number of solutions, and the time-of-flight information additionally helps localize the position of the annihilation. A central question of this investigation is how the energy, spatial and temporal resolutions help confine the possible annihilation volume. A comparison of currently technically feasible detector resolutions (under laboratory conditions) demonstrates their influence on this annihilation volume and shows that energy and coincidence time resolution have a significant impact. Improving the latter from 400 ps to 100 ps shrinks the annihilation volume by around 50%, while improving the energy resolution in the absorber layer from 12% to 4.5% results in a reduction of 60%. The inclusion of single tissue-scattered data has the potential to increase the sensitivity of a scanner by a factor of 2 to 3. The concept can be further optimized and extended to multiple scatter coincidences and subsequently validated by a reconstruction algorithm.
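Deriving the scattering angle from the scattered photon's energy follows directly from the Compton formula, cos θ = 1 − mₑc² (1/E′ − 1/E); a minimal sketch for a 511 keV annihilation photon:

```python
import math

M_E_C2 = 511.0  # electron rest energy in keV

def scatter_angle_deg(e_prime_kev, e_kev=M_E_C2):
    """Compton scattering angle from the scattered photon energy E'.

    cos(theta) = 1 - m_e c^2 * (1/E' - 1/E); in PET the incident photon
    is the 511 keV annihilation photon, so E = m_e c^2.
    """
    cos_t = 1.0 - M_E_C2 * (1.0 / e_prime_kev - 1.0 / e_kev)
    return math.degrees(math.acos(cos_t))

# a 511 keV photon measured at ~340.7 keV was scattered by 60 degrees
theta = scatter_angle_deg(2.0 * M_E_C2 / 3.0)
```

The finite energy resolution of the detector blurs E′ and hence θ, which is why the energy resolution directly controls the width of the set of possible scattering trajectories.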
The absence of a general method for endotoxin removal from liquid interfaces creates an opportunity to find new methods and materials to close this gap. Activated nanostructured carbon is a promising material that has shown good adsorption properties due to its vast pore network and high surface area. The aim of this study is to determine the adsorption rates of a carbonaceous material produced at different temperatures, as well as to reveal possible differences in the performance of the material for each of the adsorbates used in the study (hemoglobin, serum albumin and lipopolysaccharide, LPS).
An H2O2 sensor for application in industrial sterilisation processes has been developed. To this end, automated sterilisation equipment at laboratory scale was constructed using parts from industrial sterilisation facilities. In addition, a software tool was developed to control the laboratory-scale sterilisation equipment. First measurements with the developed sensor set-up as part of the sterilisation equipment have been performed, and the sensor has been physically characterised by optical microscopy and SEM.
Detection of Adrenaline Based on Bioelectrocatalytical System to Support Tumor Diagnostic Technology
(2017)
Quartz crystal nanobalance (QCN) sensors are considered powerful mass-sensitive sensors for determining materials at the sub-nanogram level. In this study, a single piezoelectric quartz crystal nanobalance modified with polystyrene was employed to detect benzene, toluene, ethylbenzene and xylene (BTEX compounds). The frequency shift of the QCN sensor was found to be linear in the BTEX compound concentrations in the range of about 1-45 mg l-1. The correlation coefficients for benzene, toluene, ethylbenzene, and xylene were 0.991, 0.9977, 0.9946 and 0.9971, respectively. Principal component analysis was also utilized to process the frequency-response data of the single piezoelectric crystal at different times, exploiting the different adsorption-desorption dynamics of the BTEX compounds. Using principal component analysis, it was found that over 90% of the data variance could be explained by two principal components (PC1 and PC2). Subsequently, the successful identification of benzene and toluene was possible through principal component analysis of the transient responses of the polystyrene-modified QCN sensor. The results showed that the polystyrene-modified QCN had favorable identification and quantification performance for the BTEX compounds.
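The explained-variance result can be illustrated with a generic PCA-via-SVD sketch on synthetic two-factor data; the data below is a toy stand-in, not the measured QCN responses:

```python
import numpy as np

rng = np.random.default_rng(0)
# toy stand-in for frequency-shift responses: 40 exposures sampled at
# 6 time points, generated from two latent adsorption-desorption dynamics
latent = rng.normal(size=(40, 2))
mixing = rng.normal(size=(2, 6))
responses = latent @ mixing + 0.05 * rng.normal(size=(40, 6))

# PCA via SVD of the mean-centred data matrix
centred = responses - responses.mean(axis=0)
_, s, _ = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)   # per-component explained variance ratio

two_pc_share = explained[:2].sum()  # > 0.9 for this two-factor toy data
```

Because the toy data has only two latent dynamics plus small noise, the first two components capture almost all of the variance, mirroring the >90% figure reported for the BTEX measurements.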
In the research domain of energy informatics, the importance of open data is rising rapidly, as can be seen from the various new public datasets being created and published. Unfortunately, in many cases the data is not available under a permissive license corresponding to the FAIR principles, often lacking accessibility or reusability. Furthermore, the source format often differs from the desired data format or does not meet the demands of efficient querying. To solve this on a small scale, a toolbox for ETL processes is provided to create a local energy data server with open-access data from different valuable sources in a structured format. So while the sources themselves do not fully comply with the FAIR principles, the provided unique toolbox allows for efficient processing of the data as if the FAIR principles were met. The energy data server currently includes information on power systems, weather data, network frequency data, European energy and gas data for demand and generation, and more. However, a solution to the core problem - the missing alignment with the FAIR principles - is still needed for the National Research Data Infrastructure.
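A single ETL step of the kind such a toolbox bundles might look as follows; the CSV layout and table schema are hypothetical illustrations (loosely modelled on the network frequency data mentioned above), using only the Python standard library:

```python
import csv
import io
import sqlite3

# extract: a hypothetical semicolon-separated source file
raw_csv = (
    "timestamp;frequency_hz\n"
    "2024-01-01T00:00:00;50.01\n"
    "2024-01-01T00:00:01;49.98\n"
)

def etl(source: str, conn: sqlite3.Connection) -> int:
    """Transform the source rows into a fixed schema and load them."""
    conn.execute("CREATE TABLE IF NOT EXISTS grid_frequency (ts TEXT, hz REAL)")
    reader = csv.DictReader(io.StringIO(source), delimiter=";")
    rows = [(r["timestamp"], float(r["frequency_hz"])) for r in reader]
    conn.executemany("INSERT INTO grid_frequency VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")  # local store standing in for the data server
loaded = etl(raw_csv, conn)         # 2 rows loaded, now queryable via SQL
```

Once loaded into a structured local store, the data can be queried efficiently regardless of the quirks of the original source format, which is the practical benefit the toolbox aims for.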