The qualitative and quantitative detection of target substances in an aqueous sample is of interest for many applications, for example the detection of contaminants in drinking water in crisis situations. Here it is important not only that pathogens can be detected with high sensitivity, but also that the analysis is fast, so that safe drinking water can be provided quickly to those affected in a disaster. Since a functioning laboratory infrastructure cannot be assumed to be available nearby in such a scenario, the measurement must be possible directly on site. This work investigated whether such rapid analysis is feasible using superparamagnetic beads (MBs) and the magnetic frequency mixing technique. The MBs are bound to the target substance via primary antibodies and fixed to the pore surface of a polyethylene filter via secondary antibodies (sandwich immunoassay). Quantification of the target substance is thus reduced to a magnetic measurement of the immobilized MB markers. The magnetic frequency mixing technique is based on exciting the sample with magnetic fields at two different frequencies. The mixing frequencies generated by the nonlinear magnetization curve of the superparamagnetic MBs are typically analyzed with a two-stage lock-in detection (analog demodulation), which was realized in a handheld magnetic reader. In addition to this technique, the principle of directly digitizing the entire response signal followed by Fourier analysis of the generated mixing frequencies was implemented experimentally, in order to capture the amplitudes and phases of several mixing frequencies simultaneously.
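The origin of the mixing frequencies can be illustrated numerically. The following is a minimal sketch, not the thesis's measurement chain: the saturating magnetization is modeled by a tanh curve as a stand-in for the Langevin function, and all frequencies and amplitudes are illustrative.

```python
import numpy as np

# Two-frequency excitation of a superparamagnetic sample whose
# magnetization saturates (modeled here by a tanh curve). The nonlinearity
# produces intermodulation ("mixing") components such as f1 + 2*f2,
# which a linear material would not generate.
fs = 2000.0          # sampling rate in Hz (illustrative)
T = 5.0              # signal duration in s -> 0.2 Hz FFT resolution
t = np.arange(0, T, 1.0 / fs)
f1, f2 = 40.0, 2.0   # high- and low-frequency drive fields (illustrative)
H = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
M = np.tanh(2.0 * H)  # saturating (nonlinear) magnetization response

spectrum = np.abs(np.fft.rfft(M)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)

def amplitude(f):
    """Spectral magnitude at the FFT bin closest to frequency f."""
    return spectrum[np.argmin(np.abs(freqs - f))]

# A clear peak appears at the mixing frequency f1 + 2*f2, while a nearby
# frequency with no m*f1 + n*f2 decomposition (43 Hz) stays empty.
print(amplitude(f1 + 2 * f2), amplitude(43.0))
```

In a measurement the amplitude of such a mixing component scales with the number of immobilized beads, which is what makes it usable for quantification.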
One way to increase sensitivity is magnetic preconcentration, in which the MBs are separated from a larger sample volume by a magnetic field gradient before the magnetic analysis. To characterize various commercial MBs with respect to their magnetic separability, a setup for measuring their magnetophoretic mobilities was realized, and their velocities in the gradient field were measured microscopically. Since a sample often needs to be tested not just for a single target substance but for several different pathogens simultaneously, various approaches enabling such a multiparametric magnetic immunoassay were developed and tested. On the one hand, a spatial separation of the binding regions for different target substances was realized, which can be read out sequentially. On the other hand, the discrimination of different target substances based on the characteristics of the differently functionalized MB types bound to them was investigated. For such a discrimination, the excitation frequency of the magnetic frequency mixing technique was varied during a measurement. This showed that different MB types can be distinguished by the phase of their frequency mixing signals. Furthermore, it was shown that the signal curve of a binary mixture of two different MB types results as a gradual transition between the curves of the two pure MB solutions. A further analysis method for a multiparametric immunoassay is to apply an additional adjustable static magnetic offset field. For this purpose, several setups based on permanent magnets and electromagnets were simulated, constructed, and characterized. Simulations showed that a discrimination based on this method is possible for MBs with different magnetic particle moments.
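The magnetophoretic velocity measured in such a setup follows from a simple force balance: the magnetic force on the bead against Stokes drag. A minimal sketch, with all parameter values illustrative rather than taken from the thesis:

```python
import math

# Terminal magnetophoretic velocity of a single bead in a field gradient:
# the magnetic force m * dB/dx (bead near magnetic saturation) is balanced
# by Stokes drag 3*pi*eta*d*v at low Reynolds number.
def magnetophoretic_velocity(m, grad_B, d, eta):
    """m: bead moment (A*m^2), grad_B: field gradient (T/m),
    d: hydrodynamic diameter (m), eta: viscosity (Pa*s)."""
    force = m * grad_B                      # N
    return force / (3 * math.pi * eta * d)  # m/s

# Illustrative values: 1 um bead, moment 1e-14 A*m^2, 10 T/m gradient, water.
v = magnetophoretic_velocity(m=1e-14, grad_B=10.0, d=1e-6, eta=1e-3)
print(f"{v * 1e6:.1f} um/s")  # micrometer-per-second scale, resolvable microscopically
```

Velocities on this scale are slow enough to track individual beads under a microscope, which is the basis of the mobility characterization described above.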
As a direct application of the magnetic reader developed here, in combination with digital demodulation, a magnetic assay against the B subunit of cholera toxin in drinking water was demonstrated with a low detection limit of 0.2 ng/ml.
Extracellular acidification is a basic indicator for alterations in two vital metabolic pathways: glycolysis and cellular respiration. Measuring these alterations by monitoring extracellular acidification with cell-based biosensors such as the light-addressable potentiometric sensor (LAPS) plays an important role in studying these pathways, whose disorders are associated with numerous diseases including cancer. However, the surface of the biosensor must be specially tailored to ensure high cell compatibility, so that cells exhibit more in vivo-like behavior, which is critical for gaining more realistic in vitro results from the analyses, e.g., drug discovery experiments. In this work, O2 plasma patterning of the LAPS surface is studied to enhance surface features of the sensor chip, e.g., wettability and biofunctionality. The surface treated with O2 plasma for 30 s exhibits enhanced cytocompatibility for adherent CHO–K1 cells, which promotes cell spreading and proliferation. The plasma-modified LAPS chip is then integrated into a microfluidic system, which provides two identical channels to facilitate differential measurements of the extracellular acidification of CHO–K1 cells. To the best of our knowledge, this is the first time that extracellular acidification within microfluidic channels has been quantitatively visualized as differential (bio-)chemical images.
In collaborative research projects, researchers and practitioners work together to solve business-critical challenges. These projects often deal with ETL processes in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are not deterministic, the quality of their output may decrease over time. This leads to an overall quality loss of the application that embeds the machine learning models. Hence, the software quality in development and in production may differ.
Machine learning models are black boxes. That makes practitioners skeptical and raises the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production enables an early response to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that possible new inputs are integrated into the model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with automated software quality monitoring in productive use and the ability to auto-generate new test data from documents processed in production.
Through automated monitoring of software quality and auto-generated test data, this approach ensures that the software quality meets and maintains the requested thresholds in productive use, even during further continuous deployment and under changing input data.
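The monitoring idea above can be sketched in a few lines. This is a hypothetical minimal implementation, not the paper's reference implementation; class and method names are made up for illustration.

```python
from collections import deque

# Minimal sketch of production quality monitoring: keep a sliding window
# of per-document evaluation results and fire an alert callback when the
# observed accuracy drops below the requested threshold.
class QualityMonitor:
    def __init__(self, threshold, window_size, on_breach):
        self.threshold = threshold
        self.window = deque(maxlen=window_size)
        self.on_breach = on_breach

    def record(self, prediction_correct: bool):
        self.window.append(prediction_correct)
        if len(self.window) == self.window.maxlen and self.accuracy() < self.threshold:
            self.on_breach(self.accuracy())

    def accuracy(self):
        return sum(self.window) / len(self.window)

alerts = []
monitor = QualityMonitor(threshold=0.9, window_size=10, on_breach=alerts.append)
for ok in [True] * 10 + [False] * 3:   # quality degrades over time
    monitor.record(ok)
```

A real deployment would replace the boolean results with metrics computed against the auto-generated test data and route the breach callback to an alerting system.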
Many important situations can be modeled with nonlinear parabolic partial differential equations (PDEs) in several dimensions. In general, these PDEs can be solved by discretizing in the spatial variables, transforming them into huge systems of ordinary differential equations (ODEs), which are very stiff. Standard explicit methods therefore require a large number of iterations to solve stiff problems, while implicit schemes are computationally very expensive when solving huge systems of nonlinear ODEs. Several families of Extrapolated Stabilized Explicit Runge-Kutta schemes (ESERK) with different orders of accuracy (3 to 6) are derived and analyzed in this work. They are explicit methods whose stability regions extend along the negative real semi-axis quadratically with the number of stages s; hence they can solve stiff problems much faster than traditional explicit schemes. Additionally, they allow the step length to be adapted easily and at very small cost.
Two new families of ESERK schemes (ESERK3 and ESERK6) are derived and analyzed in this work. Each family has more than 50 new schemes, with up to 84,000 stages in the case of ESERK6. For the first time, we also parallelized all these new variable-step-length and variable-number-of-stages algorithms (ESERK3, ESERK4, ESERK5, and ESERK6). These parallelization strategies decrease computation times significantly, as discussed and shown numerically for two problems. Thus, the new codes provide very good results compared to other well-known ODE solvers. Finally, a new strategy is proposed to increase the efficiency of these schemes, and the idea of combining ESERK families in one code is discussed: stiff problems typically have different zones, and the optimal order of convergence differs depending on these zones and the requested tolerance.
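The quadratic growth of the stability interval with the number of stages can be illustrated with a much simpler relative of these schemes. The following is not one of the paper's ESERK methods but a minimal first-order, undamped Runge-Kutta-Chebyshev sketch, whose stability polynomial is T_s(1 + z/s^2) and whose real stability interval [-2*s^2, 0] grows quadratically with s (explicit Euler only covers [-2, 0]).

```python
# Undamped first-order Runge-Kutta-Chebyshev step via the Chebyshev
# three-term recurrence; for s stages the real stability interval
# is [-2*s^2, 0], quadratic in s.
def rkc1_step(f, y, h, s):
    km2 = y                      # stage K_0
    km1 = y + (h / s**2) * f(y)  # stage K_1
    for _ in range(2, s + 1):    # K_j = 2*(K_{j-1} + (h/s^2) f(K_{j-1})) - K_{j-2}
        km2, km1 = km1, 2.0 * (km1 + (h / s**2) * f(km1)) - km2
    return km1                   # K_s = y_{n+1}

# Stiff test problem y' = -1000*y: with s = 10 stages, a step size of
# h = 0.05 gives h*lambda = -50, inside [-200, 0], so the method is stable,
# whereas explicit Euler would need h <= 0.002.
f = lambda y: -1000.0 * y
y = 1.0
for _ in range(20):
    y = rkc1_step(f, y, h=0.05, s=10)
print(abs(y))   # decays toward zero instead of blowing up
```

The extrapolated higher-order ESERK families in the paper build on the same stabilization principle while reaching orders 3 to 6.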
The Rothman–Woodroofe symmetry test statistic is revisited on the basis of independent but not necessarily identically distributed random variables. Distribution-freeness is obtained if the underlying distributions are all symmetric and continuous. The results are applied to testing symmetry in a meta-analysis random effects model. The consistency of the procedure in this situation is discussed as well. A comparison with an alternative proposal from the literature is conducted via simulations. Real data are analyzed to demonstrate how the new approach works in practice.
We discuss the problem of testing homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. It is seen that a new resampling approach is appropriate for approximating the unknown null distribution. We prove that the resulting test asymptotically reaches the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite-sample case. As an illustration, we apply the test to real data sets.
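For reference, the classical complete-data two-sample Cramér–von Mises distance can be computed directly from the empirical distribution functions. This sketch uses one common normalization and does not reproduce the paper's missing-data treatment or resampling approximation.

```python
# Two-sample Cramer-von Mises distance: integrate the squared difference
# of the two empirical CDFs over the pooled sample points, scaled by
# n*m/(n+m)^2 (one common normalization).
def cvm_two_sample(x, y):
    n, m = len(x), len(y)
    pooled = sorted(x + y)
    ecdf = lambda sample, t: sum(v <= t for v in sample) / len(sample)
    return n * m / (n + m) ** 2 * sum(
        (ecdf(x, t) - ecdf(y, t)) ** 2 for t in pooled
    )

same = cvm_two_sample([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])       # identical samples
shifted = cvm_two_sample([1.0, 2.0, 3.0], [11.0, 12.0, 13.0])  # disjoint samples
```

Identical samples give a distance of zero, while well-separated samples yield a large value; the test rejects homogeneity for large distances.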
Twee Kanten van één Medaille
(2020)
Elastic transmission eigenvalues and their computation via the method of fundamental solutions
(2020)
A stabilized version of the fundamental solution method designed to catch ill-conditioning effects is investigated, with focus on the computation of complex-valued elastic interior transmission eigenvalues in two dimensions for homogeneous and isotropic media. Its algorithm can be implemented very concisely and adapts to many similar eigenproblems based on partial differential equations, as long as the underlying fundamental solution can be generated easily. We develop a corroborative approximation analysis, which also yields new basic results for transmission eigenfunctions, and present some numerical examples which together demonstrate the feasibility of our eigenvalue recovery approach.
We propose the chance constrained programming model from stochastic programming theory to analyze limit and shakedown loads of structures with random strength following a lognormal distribution. A dual chance constrained programming algorithm is developed to calculate both the upper and lower bounds of the plastic collapse limit and the shakedown limit simultaneously. The edge-based smoothed finite element method (ES-FEM) is used with three-node linear triangular elements.
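The core of a single lognormal chance constraint has a simple deterministic equivalent, sketched below with purely illustrative numbers (this is not the paper's dual algorithm or its FEM discretization).

```python
import math
from statistics import NormalDist

# A chance constraint P(R >= L) >= p with lognormal strength R
# (log R ~ N(mu, sigma^2)) is equivalent to the deterministic bound
#   L <= exp(mu + sigma * Phi^{-1}(1 - p)),
# i.e. the admissible load is the (1 - p)-quantile of the strength.
def admissible_load(mu, sigma, p):
    return math.exp(mu + sigma * NormalDist().inv_cdf(1.0 - p))

mu, sigma = math.log(300.0), 0.1          # median strength 300 (arbitrary units)
limit = admissible_load(mu, sigma, p=0.95)  # load admissible at 95% reliability
```

Raising the required reliability p pushes the admissible load further below the median strength, which is the mechanism by which the chance constraint tightens the limit and shakedown bounds.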
Electrolyte-insulator-semiconductor (EIS) field-effect sensors belong to a new generation of electronic chips for biochemical sensing, enabling a direct electronic readout. The review gives an overview on recent advances and current trends in the research and development of chemical sensors and biosensors based on the capacitive field-effect EIS structure—the simplest field-effect device, which represents a biochemically sensitive capacitor. Fundamental concepts, physicochemical phenomena underlying the transduction mechanism and application of capacitive EIS sensors for the detection of pH, ion concentrations, and enzymatic reactions, as well as the label-free detection of charged molecules (nucleic acids, proteins, and polyelectrolytes) and nanoparticles, are presented and discussed.
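For the pH-sensing transduction mentioned above, the ideal upper bound on the signal is the Nernstian slope. The sketch below computes this textbook limit; it is a general relation, not a performance figure for the specific devices reviewed, and real gate insulators typically reach it only partially.

```python
# Ideal Nernstian limit for a pH-sensitive field-effect structure:
# the flat-band voltage shifts by 2.303*k*T/q per pH unit
# (about 59 mV/pH at room temperature).
k = 1.380649e-23     # Boltzmann constant, J/K
q = 1.602176634e-19  # elementary charge, C

def nernst_slope_mV(T=298.15):
    return 2.303 * k * T / q * 1e3  # mV per pH unit

def voltage_shift_mV(delta_pH, T=298.15, sensitivity=1.0):
    """sensitivity < 1 models a sub-Nernstian gate insulator."""
    return sensitivity * nernst_slope_mV(T) * delta_pH

print(round(nernst_slope_mV(), 1))
```

The measured capacitance-voltage curve of an EIS sensor shifts along the voltage axis by this amount per pH unit in the ideal case, which is what the readout electronics track.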
In this article, a concept of implicit methods for scalar conservation laws in one or more spatial dimensions allowing also for source terms of various types is presented. This material is a significant extension of previous work of the first author (Breuß, SIAM J. Numer. Anal. 43(3), 970–986, 2005). Implicit notions are developed that are centered around a monotonicity criterion. We demonstrate a connection between a numerical scheme and a discrete entropy inequality, which is based on a classical approach by Crandall and Majda. Additionally, three implicit methods are investigated using the developed notions. Next, we conduct a convergence proof which is not based on a classical compactness argument. Finally, the theoretical results are confirmed by various numerical tests.
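The flavor of a monotone implicit scheme can be shown on the simplest conservation law. The following is a sketch of the general idea, not one of the article's three methods: implicit upwind for linear advection, which is unconditionally stable and monotone, so it satisfies a discrete maximum principle even at CFL numbers above 1.

```python
# Implicit upwind for u_t + a*u_x = 0 (a > 0) with an inflow boundary:
#   (1 + nu) * u_i^{n+1} - nu * u_{i-1}^{n+1} = u_i^n,   nu = a*dt/dx.
# The bidiagonal system is solved by a single forward sweep, and every new
# value is a convex combination of old/neighboring values (monotonicity).
def implicit_upwind_step(u, nu, inflow):
    new = [0.0] * len(u)
    prev = inflow                       # boundary value u_0^{n+1}
    for i in range(len(u)):
        new[i] = (u[i] + nu * prev) / (1.0 + nu)
        prev = new[i]
    return new

u = [0.0] * 5 + [1.0] * 5               # discrete step profile in [0, 1]
for _ in range(10):
    u = implicit_upwind_step(u, nu=3.0, inflow=0.0)  # CFL number 3 > 1
```

No overshoots or undershoots appear: the solution stays within the bounds of the initial data and remains monotone, which is exactly the behavior the article's monotonicity criterion formalizes.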
The established Hoeffding–Blum–Kiefer–Rosenblatt independence test statistic is investigated for partly not identically distributed data. Surprisingly, it turns out that the statistic has the well-known distribution-free limiting null distribution of the classical criterion under standard regularity conditions. An application is testing goodness-of-fit for the regression function in a nonparametric random effects meta-regression model, where consistency is obtained as well. Simulations investigate size and power of the approach for small and moderate sample sizes. A real data example based on clinical trials illustrates how the test can be used in applications.
Interior transmission eigenvalue problems for the Helmholtz equation play an important role in inverse wave scattering. Some distribution properties of those eigenvalues in the complex plane are reviewed. Further, a new scattering model for the interior transmission eigenvalue problem with mixed boundary conditions is described and an efficient algorithm for computing the interior transmission eigenvalues is proposed. Finally, extensive numerical results for a variety of two-dimensional scatterers are presented to show the validity of the proposed scheme.
We present new numerical results for shape optimization problems of interior Neumann eigenvalues. This field is not well understood from a theoretical standpoint. The existence of shape maximizers is not proven beyond the first two eigenvalues, so we study the problem numerically. We describe a method to compute the eigenvalues for a given shape that combines the boundary element method with an algorithm for nonlinear eigenvalues. As numerical optimization requires many such evaluations, we put a focus on the efficiency of the method and the implemented routine. The method is well suited for parallelization. Using the resulting fast routines and a specialized parametrization of the shapes, we found improved maxima for several eigenvalues.
Kyphoplasty of Osteoporotic Fractured Vertebrae: A Finite Element Analysis about Two Types of Cement
(2019)
Heating efficiency of magnetic nanoparticles decreases with gradual immobilization in hydrogels
(2019)
Monitoring the cellular metabolism of bacteria in (bio)fermentation processes is crucial to control and steer them, and to prevent undesired disturbances linked to metabolically inactive microorganisms. In this context, cell-based biosensors can play an important role in improving the quality and increasing the yield of such processes. This work describes the simultaneous analysis of the metabolic behavior of three different types of bacteria by means of a differential light-addressable potentiometric sensor (LAPS) set-up. The study includes Lactobacillus brevis, Corynebacterium glutamicum, and Escherichia coli, which are often applied in fermentation processes in bioreactors. Differential measurements were carried out to compensate for undesirable influences such as sensor signal drift and pH variation during the measurements. Furthermore, calibration curves of the cellular metabolism were established as a function of the glucose concentration or cell number variation with all three model microorganisms. In this context, simultaneous (bio)sensing with the multi-organism LAPS-based set-up can open new possibilities for cost-effective, rapid detection of the extracellular acidification of bacteria on a single sensor chip. It can be applied to evaluate the metabolic response of bacteria populations in a (bio)fermentation process, for instance, in the biogas fermentation process.
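The differential compensation principle can be sketched with synthetic signals. All signal shapes and rates below are made up for illustration; this is not the set-up's actual data processing.

```python
# Differential measurement idea: both channels see the same sensor drift
# and background, while only the measurement channel carries the bacterial
# acidification. Subtracting the cell-free reference channel removes the
# common-mode drift and isolates the metabolic signal.
n = 100                                       # samples over the measurement
drift = [0.002 * i for i in range(n)]         # common-mode sensor drift (a.u.)
acidification = [0.01 * i for i in range(n)]  # metabolic signal of interest (a.u.)

measurement = [d + a for d, a in zip(drift, acidification)]  # cells + drift
reference = drift[:]                          # cell-free channel: drift only

differential = [m - r for m, r in zip(measurement, reference)]
```

After subtraction, the differential trace reproduces the acidification signal alone, which is why drift and pH variation need not be controlled to the same precision as in a single-channel measurement.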
Enzyme-catalyzed reactions have been designed to mimic various Boolean logic gates in the general framework of unconventional biomolecular computing. While some of the logic gates, particularly OR and AND, are easy to realize with biocatalytic reactions and have been reported in numerous publications, some others, like NXOR, are very challenging and have not yet been realized with enzyme reactions. The paper reports on a novel approach to mimicking the NXOR logic gate using the bell-shaped dependence of enzyme activity on pH. Shifting the pH from the optimum value to acidic or basic values by using acid or base inputs (the (1,0) and (0,1) inputs) inhibits the enzyme reaction, while keeping the optimum pH (the (0,0) and (1,1) input combinations) preserves a high enzyme activity. The challenging part of the present approach is the selection of an enzyme with a well-demonstrated bell-shaped activity dependence on pH. While many enzymes satisfy this condition, we selected pyrroloquinoline quinone (PQQ)-dependent glucose dehydrogenase, as this enzyme's pH optimum is centrally located on the pH scale, allowing its activity to be changed by acidic and basic pH shifts from the optimum value at which activity is highest. The present NXOR gate is added to the biomolecular "toolbox" as a new example of Boolean logic gates based on enzyme reactions.
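The gate logic can be captured in a toy model. The Gaussian activity profile, the optimum pH, and the shift magnitude below are illustrative stand-ins, not measured parameters of the PQQ-dependent enzyme.

```python
import math

# Toy model of the pH-based NXOR gate: enzyme activity is bell-shaped
# around an optimum pH. Equal inputs (0,0: no reagent; 1,1: acid and base
# neutralize each other) leave the pH at the optimum and the activity high;
# unequal inputs (acid only or base only) shift the pH and inhibit the enzyme.
PH_OPT, SHIFT, WIDTH = 7.0, 3.0, 1.0   # illustrative values

def activity(pH):
    return math.exp(-((pH - PH_OPT) / WIDTH) ** 2)  # bell-shaped profile

def nxor_gate(acid, base):
    pH = PH_OPT - SHIFT * acid + SHIFT * base  # inputs shift the pH
    return 1 if activity(pH) > 0.5 else 0      # threshold the readout

truth_table = {(a, b): nxor_gate(a, b) for a in (0, 1) for b in (0, 1)}
```

The resulting truth table is high for equal inputs and low for unequal ones, i.e. exactly the NXOR (XNOR) behavior the bell-shaped pH dependence provides.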
Hydrogen peroxide (H2O2) is a typical surface sterilization agent for packaging materials used in the pharmaceutical, food and beverage industries. We use the finite-element method to analyze the conceptual design of an in-line thermal evaporation unit that produces a heated gas mixture of air and evaporated H2O2 solution. For the numerical model, the required phase-transition variables of pure H2O2 solution and of the aerosol mixture are acquired from vapor-liquid equilibrium (VLE) diagrams derived from vapor-pressure formulations. This work combines homogeneous single-phase turbulent flow with heat-transfer physics to describe the operation of the evaporation unit. We introduce the apparent heat-capacity concept to approximate the non-isothermal phase-transition process of the H2O2-containing aerosol. Empirical and analytical functions are defined to represent the temperature- and pressure-dependent material properties of the aqueous H2O2 solution, the aerosol and the gas mixture. To validate the numerical model, the simulation results are compared to experimental data on the heating power required to produce the gas mixture, showing good agreement with deviations below 10%. Experimental observations of deposit formation due to the evaporation of stabilized H2O2 solution fit the predictions made from the simulation results.
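The apparent heat-capacity concept can be illustrated numerically. This sketch uses a Gaussian smearing function and water-like numbers purely for illustration; the paper's empirical property functions for the H2O2 aerosol are not reproduced.

```python
import math

# Apparent heat-capacity concept: fold the latent heat L of the phase
# transition into an effective c_p by smearing L over a narrow temperature
# interval dT around the transition temperature T_m (Gaussian weight here).
c_p = 4186.0   # J/(kg*K), liquid specific heat (water-like, illustrative)
L = 2.26e6     # J/kg, latent heat of evaporation (illustrative)
T_m, dT = 373.0, 2.0  # K, transition temperature and smearing width

def c_apparent(T):
    gauss = math.exp(-((T - T_m) / dT) ** 2) / (dT * math.sqrt(math.pi))
    return c_p + L * gauss  # normalized so the extra area equals L

# Integrating c_apparent - c_p across the transition recovers the latent
# heat, which is what makes the single-phase heat equation energy-consistent.
step = 0.01
recovered = sum((c_apparent(T) - c_p) * step
                for T in [353.0 + step * i for i in range(int(40.0 / step))])
```

Because the smoothed transition has finite width, the solver never sees a discontinuous enthalpy jump, which keeps the non-isothermal phase-transition model numerically tractable.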