Suppose we have k samples X₁,₁,…,X₁,ₙ₁,…,Xₖ,₁,…,Xₖ,ₙₖ with different sample sizes n₁,…,nₖ and unknown underlying distribution functions F₁,…,Fₖ as observations, plus k families of distribution functions {G₁(⋅,ϑ); ϑ∈Θ},…,{Gₖ(⋅,ϑ); ϑ∈Θ}, each indexed by elements ϑ from the same parameter set Θ. We consider the new goodness-of-fit problem of whether or not (F₁,…,Fₖ) belongs to the parametric family {(G₁(⋅,ϑ),…,Gₖ(⋅,ϑ)); ϑ∈Θ}. New test statistics are presented, and a parametric bootstrap procedure for approximating the unknown null distributions is discussed. Under regularity assumptions, it is proved that the approximation works asymptotically, and the limiting distributions of the test statistics under the null hypothesis are determined. Simulation studies investigate the quality of the new approach for small and moderate sample sizes. Applications to real data sets illustrate how the idea can be used for verifying model assumptions.
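The abstract leaves the concrete statistics unspecified; purely as an illustration of the parametric bootstrap principle it describes, the following sketch treats the one-sample special case (k = 1) with a normal location-scale family and a Kolmogorov-Smirnov-type distance. The statistic and all names are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
from scipy import stats

def ks_stat(sample, cdf):
    """Kolmogorov-Smirnov-type distance between the empirical CDF and cdf."""
    x = np.sort(sample)
    n = len(x)
    f = cdf(x)
    return max(np.max(np.arange(1, n + 1) / n - f),
               np.max(f - np.arange(0, n) / n))

def parametric_bootstrap_gof(sample, n_boot=999, rng=None):
    """Test H0: sample ~ N(mu, sigma^2) for some (mu, sigma).

    The null distribution of the statistic depends on the estimated
    parameters, so it is approximated by resampling from the fitted model
    and re-estimating the parameters on every bootstrap sample.
    """
    rng = np.random.default_rng(rng)
    mu, sigma = np.mean(sample), np.std(sample, ddof=1)
    t_obs = ks_stat(sample, stats.norm(mu, sigma).cdf)
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.normal(mu, sigma, size=len(sample))
        mub, sigb = np.mean(xb), np.std(xb, ddof=1)
        t_boot[b] = ks_stat(xb, stats.norm(mub, sigb).cdf)
    p_value = (1 + np.sum(t_boot >= t_obs)) / (n_boot + 1)
    return t_obs, p_value
```

In the k-sample setting of the paper, the same scheme would apply with a joint estimate of ϑ and one statistic aggregating the k marginal discrepancies.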
Selected problems in the field of multivariate statistical analysis are treated, with one focus on the paired sample case. Among other things, statistical testing problems of marginal homogeneity are under consideration. In detail, properties of Hotelling's T² test in a special parametric situation are obtained. Moreover, the nonparametric problem of marginal homogeneity is discussed on the basis of possibly incomplete data. In the bivariate data case, properties of the Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic based on partly non-identically distributed data are investigated. Similar testing problems are treated within the scope of the application of a result for the empirical process of the concomitants for partly categorical data. Furthermore, testing changes in the modeled solvency capital requirement of an insurance company by means of a paired sample from an internal risk model is discussed. Beyond the paired sample case, a new asymptotic relative efficiency concept based on the expected volumes of multidimensional confidence regions is introduced. Besides, a new approach for the treatment of the multi-sample goodness-of-fit problem is presented. Finally, a consistent test for the goodness-of-fit problem is developed against the background of huge or infinite-dimensional data.
The Cramér-von-Mises distance is applied to the distribution of the excess over a confidence level. Asymptotics of related statistics are investigated, and it turns out that the obtained limit distributions differ from the classical ones. For that reason, quantiles of the new limit distributions are given, and new bootstrap techniques for approximation purposes are introduced and justified. The results motivate new one-sample goodness-of-fit tests for the distribution of the excess over a confidence level and a new confidence interval for the related fitting error. Simulation studies investigate the size and power of the tests as well as the coverage probabilities of the confidence interval in the finite sample case. A practice-oriented application of the Cramér-von-Mises tests is the determination of an appropriate confidence level for the fitting approach. The adaptation of the idea to the well-known problem of threshold detection in the context of peaks-over-threshold modelling is sketched and illustrated with data examples.
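For orientation, the classical one-sample Cramér-von-Mises statistic, the building block that the paper restricts to the excess distribution, has a well-known computational form; a minimal sketch follows (the restriction to the excess over a confidence level, which changes the limit distribution, is not shown):

```python
import numpy as np

def cramer_von_mises(sample, cdf):
    """Classical one-sample Cramer-von-Mises statistic:
    W^2 = 1/(12 n) + sum_i (F(x_(i)) - (2 i - 1)/(2 n))^2,
    where x_(1) <= ... <= x_(n) are the order statistics."""
    x = np.sort(sample)
    n = len(x)
    i = np.arange(1, n + 1)
    return 1.0 / (12 * n) + np.sum((cdf(x) - (2 * i - 1) / (2 * n)) ** 2)
```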
On the basis of independent and identically distributed bivariate random vectors, whose components are a categorical and a continuous variable, respectively, the related concomitants, also called induced order statistics, are considered. The main theoretical result is a functional central limit theorem for the empirical process of the concomitants in a triangular array setting. A natural application is hypothesis testing. An independence test and a two-sample test are investigated in detail. The fairly general setting enables limit results under local alternatives and for bootstrap samples. For comparison with existing tests from the literature, simulation studies are conducted. The empirical results obtained confirm the theoretical findings.
An array of four independently wired indium tin oxide (ITO) electrodes was used for electrochemically stimulated DNA release and the activation of DNA-based Identity, AND and XOR logic gates. Single-stranded DNA molecules were loaded onto the mixed poly(N,N-dimethylaminoethyl methacrylate) (PDMAEMA)/poly(methacrylic acid) (PMAA) brush covalently attached to the ITO electrodes. The DNA deposition was performed at pH 5.0, when the polymer brush is positively charged due to protonation of the tertiary amino groups in PDMAEMA, resulting in electrostatic attraction of the negatively charged DNA. By applying electrolysis at −1.0 V (vs. an Ag/AgCl reference), electrochemical oxygen reduction resulted in the consumption of hydrogen ions and a local pH increase near the electrode surface. This recharged the polymer brush to a negative state due to dissociation of the carboxylic groups of PMAA, thus repelling the negatively charged DNA and releasing it from the electrode surface. The DNA release was performed in various combinations from different electrodes in the array assembly. The released DNA operated as input signals for the activation of the Boolean logic gates. The developed system represents a step forward in DNA computing, combining for the first time DNA chemical processes with electronic input signals.
The present article describes a standard instrument for the continuous online determination of retinal vessel diameters, the commercially available retinal vessel analyzer. This report is intended to provide informed guidelines for measuring ocular blood flow with this system. The report describes the principles underlying the method and the instruments currently available, and discusses clinical protocol and the specific parameters measured by the system. Unresolved questions and the possible limitations of the technique are also discussed.
HisT/PLIER: A Two-Fold Provenance Approach for Grid-Enabled Scientific Workflows Using WS-VLAM
(2011)
An increasing number of applications target their execution at specialized hardware such as general-purpose graphics processing units. Some cloud computing providers offer this specific hardware, so organizations can rent such resources. However, outsourcing the whole application to the cloud incurs avoidable costs if only some parts of the application benefit from the expensive specialized hardware. Partial execution of applications in the cloud is a tradeoff between costs and efficiency. This paper addresses the demand for a consistent framework that allows for a mixture of on- and off-premise calculations by migrating only specific parts to a cloud. It uses the concept of workflows to show how individual workflow tasks can be migrated to the cloud while the remaining tasks are executed on-premise.
Experience has shown that statically created a priori resource allocation plans are vulnerable to runtime deviations and hence often become uneconomic or greatly exceed a predefined soft deadline. The assumption of constant task execution times during allocation planning is even less realistic in a cloud environment, where virtualized resources vary in performance. Revising the initially created resource allocation plan at runtime allows the scheduler to react to deviations between planning and execution. Such adaptive rescheduling of a many-task application workflow is only feasible when the planning time can be handled efficiently at runtime. In this paper, we present the static low-complexity resource allocation planning algorithm (LCP), which efficiently schedules many-task scientific application workflows on cloud resources of different capabilities. The benefits of the presented algorithm are benchmarked against alternative approaches. The benchmark results show that LCP not only competes with higher-complexity algorithms in terms of planned costs and planned makespan but also outperforms them significantly, by factors of 2 to 160, in terms of required planning time. Hence, LCP is superior in terms of practical usability where low planning time is essential, such as in our targeted online rescheduling scenario.
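The abstract does not disclose LCP's internals; solely to illustrate what a low-complexity static allocation planner can look like, here is a hedged greedy sketch that assigns each task to the resource minimizing a weighted combination of cost and finish time, running in O(#tasks × #resources). All names and the scoring rule are assumptions, not the authors' algorithm.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    speed: float          # work units processed per hour
    price: float          # cost per hour
    free_at: float = 0.0  # time at which the resource becomes available

def plan(tasks, resources, cost_weight=0.5):
    """Greedy static allocation: for each task (in the given order), pick
    the resource with the best weighted combination of execution cost and
    finish time, then advance that resource's availability."""
    schedule = []
    for name, work in tasks:  # tasks: list of (name, work units)
        def score(r):
            duration = work / r.speed
            finish = r.free_at + duration
            return cost_weight * duration * r.price + (1 - cost_weight) * finish
        best = min(resources, key=score)
        start = best.free_at
        best.free_at = start + work / best.speed
        schedule.append((name, best.name, start, best.free_at))
    return schedule
```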
Heavy metal determination by means of resistance measurements and voltammetry at thin-film electrodes
(1998)
Trace metal determination by dc resistance changes of microstructured thin gold film electrodes
(1999)
This study investigated the damping performance of shin guards of the kind used in football. It was conducted with a pendulum hammer that allowed different impact forces to be applied to the guards. The results showed that shin guards achieve their best performance at peak forces below 5 kN, but that there is room for improvement at higher loads. A promising starting point for this was found, among other things through the use of new materials, in an adequate interplay of the guard's shell and padding. The investigation further showed that, at least in part, a clear improvement in the damping performance of shin guards has been achieved in recent years.
Background/Aims: Common systems for the quantification of cellular contraction rely on animal-based models, complex experimental setups or indirect approaches. The herein presented CellDrum technology for testing the mechanical tension of cellular monolayers and thin tissue constructs has the potential to scale up mechanical testing towards medium-throughput analyses. Using hiPS cardiac myocytes (hiPS-CMs), it represents a new perspective on drug testing and brings us closer to personalized drug medication. Methods: In the present study, monolayers of self-beating hiPS-CMs were grown on ultra-thin circular silicone membranes, which deflect under the weight of the culture medium. Rhythmic contractions of the hiPS-CMs induced variations in the membrane deflection. The recorded contraction-relaxation cycles were analyzed with respect to their amplitudes, durations, time integrals and frequencies. Besides unstimulated force and tensile stress, we investigated the effects of agonists and antagonists acting on Ca²⁺ channels (S-Bay K8644/verapamil) and Na⁺ channels (veratridine/lidocaine). Results: The measured data and simulations for pharmacologically unstimulated contraction resembled findings in native human heart tissue, while the pharmacological dose-response curves were highly accurate and consistent with reference data. Conclusion: We conclude that the combination of the CellDrum with hiPS-CMs offers a fast, facile and precise system for pharmacological and toxicological studies and opens up new potential for preclinical basic research.
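As an illustration of how the recorded contraction-relaxation cycles could be reduced to the quantities named above (amplitudes, durations, time integrals, frequencies), here is a minimal sketch based on standard peak detection; the sampling rate, thresholds, and baseline choice are placeholder assumptions, not the CellDrum software.

```python
import numpy as np
from scipy.signal import find_peaks

def analyze_cycles(deflection, fs):
    """Extract basic contraction metrics from a membrane deflection trace.

    deflection : 1-D array of membrane deflection over time
    fs         : sampling rate in Hz
    """
    signal = deflection - np.median(deflection)   # remove static baseline
    # Contraction maxima; enforce a minimal peak distance of 0.25 s.
    peaks, props = find_peaks(signal, height=0.1 * signal.max(),
                              distance=int(0.25 * fs))
    amplitudes = props["peak_heights"]
    periods = np.diff(peaks) / fs                 # cycle durations in s
    frequency = 1.0 / periods.mean() if len(periods) else np.nan
    integral = np.trapz(np.clip(signal, 0, None), dx=1.0 / fs)
    return {"amplitude_mean": amplitudes.mean(),
            "cycle_duration_mean_s": periods.mean() if len(periods) else np.nan,
            "beat_frequency_hz": frequency,
            "time_integral": integral}
```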
After a short introduction of a new nonconforming linear finite element on quadrilaterals recently developed by Park, we derive a dual weighted residual-based a posteriori error estimator (in the sense of Becker and Rannacher) for this finite element. By computing a corresponding dual solution, we estimate the error with respect to a given target error functional. The reliability and efficiency of this estimator are analyzed in several numerical experiments.
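For orientation, the generic dual weighted residual (DWR) error representation in the sense of Becker and Rannacher, independent of the particular Park element, reads schematically as follows; z denotes the dual solution associated with the target functional J and i_h z a suitable interpolant:

```latex
% Schematic DWR error representation: rho(u_h)(.) is the primal residual,
% z the dual solution, i_h z an interpolant into the discrete space.
J(u) - J(u_h) \;\approx\; \rho(u_h)(z - i_h z)
  \;=\; \sum_{K \in \mathcal{T}_h} \rho_K(u_h)(z - i_h z),
\qquad
\eta \;:=\; \sum_{K \in \mathcal{T}_h} \underbrace{\bigl|\rho_K(u_h)(z - i_h z)\bigr|}_{=:\,\eta_K}.
```

The dual solution is computed numerically, and the localized indicators η_K steer the adaptive refinement towards the target functional.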
A method for detecting and approximating fault lines or surfaces, respectively, or decision curves in two and three dimensions with guaranteed accuracy is presented. Reformulated as a classification problem, our method starts from a set of scattered points along with the corresponding classification algorithm and constructs a representation of the decision curve by points with a prescribed maximal distance to the true decision curve. Our algorithm ensures that the representing point set covers the decision curve in its entire extent and features local refinement based on the geometric properties of the decision curve. We demonstrate applications of our method to problems related to the detection of faults, to multi-criteria decision aid and, in combination with Kirsch's factorization method, to solving an inverse acoustic scattering problem. In all applications considered in this work, our method requires significantly fewer pointwise classifications than previously employed algorithms.
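The elementary primitive behind such methods, locating a single point within a prescribed distance of the decision curve, can be realized by bisection along a segment whose endpoints are classified differently; a minimal sketch follows (coverage of the entire curve and local refinement, as described above, are not shown):

```python
import numpy as np

def bisect_decision_point(classify, a, b, tol):
    """Find a point within distance tol of the decision curve on the
    segment [a, b], assuming classify(a) != classify(b).

    classify : callable mapping a point to a class label
    a, b     : segment endpoints (numpy arrays) with different labels
    tol      : prescribed maximal distance to the true decision curve
    """
    ca = classify(a)
    assert ca != classify(b), "endpoints must lie in different classes"
    while np.linalg.norm(b - a) > 2 * tol:
        mid = 0.5 * (a + b)
        if classify(mid) == ca:
            a = mid   # the decision curve crosses [mid, b]
        else:
            b = mid   # the decision curve crosses [a, mid]
    return 0.5 * (a + b)
```

Since the curve crosses the final segment of length at most 2·tol, the returned midpoint is within tol of the true decision curve.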
Recently, we introduced and mathematically analysed a new method for grid deformation (Grajewski et al., 2009) [15], which we call the basic deformation method (BDM) here. It generalises the method proposed by Liao et al. (Bochev et al., 1996; Cai et al., 2004; Liao and Anderson, 1992) [4], [6], [20]. In this article, we employ the BDM as the core of a new multilevel deformation method (MDM), which leads to vast improvements in robustness, accuracy and speed. We achieve this by splitting the deformation process into a sequence of easier subproblems and by exploiting the grid hierarchy. Being of optimal asymptotic complexity, the MDM yields speed-ups of up to a factor of 15 in our test cases compared to the BDM. This gives our MDM the potential for tackling large grids and time-dependent problems, where the grid may have to be dynamically deformed once per time step according to the user's needs. Moreover, we elaborate on implementational aspects, in particular efficient grid searching, which is a key ingredient of the BDM.
Analyzing electroencephalographic (EEG) time series can be challenging, especially with deep neural networks, due to the large variability among human subjects and often small datasets. To address these challenges, various strategies, such as self-supervised learning, have been suggested, but they typically rely on extensive empirical datasets. Inspired by recent advances in computer vision, we propose a pretraining task termed "frequency pretraining" to pretrain a neural network for sleep staging by predicting the frequency content of randomly generated synthetic time series. Our experiments demonstrate that our method surpasses fully supervised learning in scenarios with limited data and few subjects, and matches its performance in regimes with many subjects. Furthermore, our results underline the relevance of frequency information for sleep stage scoring, while also demonstrating that deep neural networks utilize information beyond frequencies to enhance sleep staging performance, which is consistent with previous research. We anticipate that our approach will be advantageous across a broad spectrum of applications where EEG data is limited or derived from a small number of subjects, including the domain of brain-computer interfaces.
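A sketch of how the described pretext task could generate its training data: synthetic time series built as random sinusoid mixtures, labeled by the frequency bins that contributed, so that a network can be pretrained to predict frequency content before fine-tuning on sleep staging. Bin boundaries, signal length, and noise level are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def make_pretraining_batch(batch_size, n_samples=3000, fs=100.0,
                           freq_bins=(0.5, 4.0, 8.0, 13.0, 30.0), rng=None):
    """Generate synthetic signals and multi-hot frequency-content labels.

    Each signal is a random mixture of sinusoids; the label marks which
    of the candidate frequencies contributed. A network pretrained to
    predict these labels learns frequency-sensitive features.
    """
    rng = np.random.default_rng(rng)
    t = np.arange(n_samples) / fs
    x = np.zeros((batch_size, n_samples), dtype=np.float32)
    y = np.zeros((batch_size, len(freq_bins)), dtype=np.float32)
    for i in range(batch_size):
        active = rng.random(len(freq_bins)) < 0.5  # random subset of bins
        for j, f0 in enumerate(freq_bins):
            if active[j]:
                amp = rng.uniform(0.5, 2.0)
                phase = rng.uniform(0, 2 * np.pi)
                x[i] += amp * np.sin(2 * np.pi * f0 * t + phase)
        x[i] += rng.normal(0, 0.1, n_samples)      # measurement noise
        y[i] = active
    return x, y
```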
Reliable automation of the labor-intensive manual task of scoring animal sleep can facilitate the analysis of long-term sleep studies. In recent years, deep-learning-based systems, which learn optimal features from the data, have increased scoring accuracies for the classical sleep stages of Wake, REM, and Non-REM. Meanwhile, it has been recognized that the statistics of transitional stages such as pre-REM, found between Non-REM and REM, may hold additional insight into the physiology of sleep and are now under active investigation. We propose a classification system based on a simple neural network architecture that scores the classical stages as well as pre-REM sleep in mice. When restricted to the classical stages, the optimized network showed state-of-the-art classification performance with an out-of-sample F1 score of 0.95 in male C57BL/6J mice. When unrestricted, the network showed lower F1 scores on pre-REM (0.5) compared to the classical stages. This result is comparable to previous attempts to score transitional stages in other species, such as transition sleep in rats or N1 sleep in humans. Nevertheless, we observed that the sequence of predictions including pre-REM typically transitioned from Non-REM to REM, reflecting the sleep dynamics observed by human scorers. Our findings provide further evidence for the difficulty of scoring transitional sleep stages, likely because such stages of sleep are under-represented in typical data sets or show large inter-scorer variability. We further provide our source code and an online platform to run predictions with our trained network.
In this article, we report on the heat-transfer resistance at interfaces as a novel, denaturation-based method to detect single-nucleotide polymorphisms in DNA. We observed that a molecular brush of double-stranded DNA grafted onto synthetic diamond surfaces does not notably affect the heat-transfer resistance at the solid-to-liquid interface. In contrast, molecular brushes of single-stranded DNA surprisingly cause a substantially higher heat-transfer resistance and behave like a thermally insulating layer. This effect can be utilized to identify ds-DNA melting temperatures via the switching from low to high heat-transfer resistance. The melting temperatures identified with this method for different DNA duplexes (29 base pairs, without and with built-in mutations) correlate nicely with data calculated by modeling. The method is fast, label-free (without the need for fluorescent or radioactive markers), allows for repetitive measurements, and can also be extended toward array formats. Reference measurements by confocal fluorescence microscopy and impedance spectroscopy confirm that the switching of the heat-transfer resistance upon denaturation is indeed related to the thermal on-chip denaturation of DNA.
In the expansion of a sustainable, renewable energy supply, the conversion of organic biomass into biogas holds great potential. The underlying complex biological process is still insufficiently understood and requires systematic investigation of the process parameters in order to achieve a high yield at good gas quality. The questions involved in decoding the process are of both a process-engineering and a microbiological nature. From a microbiological point of view, knowledge of the process-carrying microorganisms actually involved is of considerable importance; from a process-engineering point of view, it is knowledge of the physical and chemical factors that control the microbiological processes. In the interplay of all these parameters, biogas formation is promoted or hindered, up to the point of process failure.
One possible monitoring method is measuring the metabolic activity of the process-carrying organisms.
Building on sound process data obtained from a parallel plant, this is to be realized with a light-addressable potentiometric sensor system (LAPS). This sensor is able to detect pH changes caused by the metabolism of the organisms immobilized on the chip, enabling online monitoring of biogas plants.
Computer Mathematics with Maple
(2003)
Modern industry and multi-discipline projects require highly trained individuals with resilient science and engineering backgrounds. Graduates must be able to agilely apply excellent theoretical knowledge in their subject matter as well as essential practical "hands-on" knowledge of diverse working processes to solve complex problems. To meet these demands, university education follows the concept of Constructive Alignment and thus increasingly adapts the teaching of the necessary practical skills, and the corresponding assessment routines, to actual industry requirements. However, a systematic approach to coherently align these three central teaching demands is strangely absent from current university curricula. We demonstrate the feasibility of implementing practical assessments in a regular theory-based examination, thus defining the term "blended assessment". We assessed a course for natural science and engineering students pursuing a career in biomedical engineering and evaluated the benefit of blended assessment exams for students and lecturers. Our controlled study assessed the physiological background of electrocardiograms (ECGs), the practical measurement of ECG curves, and the interpretation of basic pathologic alterations. To study long-term effects, students were assessed on the topic twice, with a time lag of six months. Our findings suggest a significant improvement in student gain with respect to practical skills and theoretical knowledge. The results of the reassessments support these outcomes. From the lecturers' point of view, blended assessment complements practical training courses while keeping the organizational effort manageable. We consider blended assessment a viable tool for providing an industry-ready education format with improved student gain, and it should be evaluated and established further to prepare university graduates optimally for their future careers.
Mechanics: Lecture
(1987)