Filters (faceted search, 1695 results)
- Publication year: 1968–2024
- Institute: Fachbereich Medizintechnik und Technomathematik (1695)
- Language: English (1695)
- Document type: journal article (1352), conference publication (217), book chapter (44), monograph (43), doctoral thesis (18), other (6), conference meeting abstract (4), patent (4), preprint (3), lecture (2)
- Keywords: Biosensor (25), finite element method (12), shakedown <material> (10), CAD (8), civil engineering (8), civil engineering [Bauingenieurwesen] (7), FEM (6), limit analysis (6), Shakedown analysis (6), shakedown analysis (6)
A Classical Reformulation of Finite-Dimensional Quantum Mechanics. Hellwig, K.-E.; Stulpe, W.
(1993)
The readout of gamma detectors is considerably simplified when the event intensity is encoded as a pulse width (pulse-width modulation, PWM). Time-to-digital converters (TDCs) replace the conventional ADCs, and multiple TDCs can easily be realized in a single PLD (programmable logic device) chip. The output of a PWM stage is a single digital signal per channel, which is well suited for transport, so further processing can be performed away from the detector. This is particularly interesting for large systems with high channel density (e.g., high-resolution scanners). In this work we present a circuit with a linear transfer function that requires a minimum of components by performing the PWM already in the preamplifier stage. This allows a very compact and cost-efficient implementation of the front-end electronics.
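The encode-then-count scheme described above can be sketched as a small simulation. The clock frequency, the linear transfer slope, and all function names below are illustrative assumptions, not values or identifiers from the paper:

```python
# Illustrative sketch: encode an event intensity as a pulse width (PWM)
# and read it back with a time-to-digital converter (TDC).
# All constants are hypothetical; the paper's circuit parameters differ.

TDC_CLOCK_HZ = 200e6      # assumed TDC clock frequency
SLOPE_NS_PER_MV = 2.0     # assumed linear transfer: pulse width per mV

def pwm_encode(intensity_mv: float) -> float:
    """Map an event intensity (mV) to a pulse width (ns); linear transfer."""
    return SLOPE_NS_PER_MV * intensity_mv

def tdc_readout(width_ns: float) -> int:
    """Count the TDC clock ticks spanned by the pulse."""
    tick_ns = 1e9 / TDC_CLOCK_HZ
    return round(width_ns / tick_ns)

def decode(ticks: int) -> float:
    """Recover the intensity estimate from the tick count."""
    tick_ns = 1e9 / TDC_CLOCK_HZ
    return (ticks * tick_ns) / SLOPE_NS_PER_MV

ticks = tdc_readout(pwm_encode(100.0))   # a 100 mV event
print(ticks, decode(ticks))
```

Because the channel output is a single digital signal, the decode step can run far from the detector, which is the transport advantage the abstract highlights.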
A concept for a sensitive micro total analysis system for high throughput fluorescence imaging
(2006)
This paper discusses possible methods for on-chip fluorescence imaging in integrated biosensors. Integrating optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging: it can boost the signal-to-background ratio by a few orders of magnitude compared to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis systems (µTAS). The first method relies on side illumination of the fluorescent material placed into the microcompartments of the lab-on-chip; its significance lies in the high utilization of excitation energy at low concentrations of fluorescent material. The second method utilizes a transparent µLED chip, which allows the excitation light source to be placed on the same optical axis as the emission detector, such that the excitation and emission rays travel in opposite directions. The third method presents spatial filtering of the excitation background.
A melting probe equipped with an autofluorescence-based detection system, combined with a light-scattering unit and, optionally, a microarray chip, would be ideally suited to probe icy environments such as Europa's ice layer, as well as the polar ice layers of Earth and Mars, for extant and extinct life.
A nonparametric goodness-of-fit test for random variables with values in a separable Hilbert space is investigated. To verify the null hypothesis that the data come from a specific distribution, an integral-type test based on a Cramér-von Mises statistic is suggested. The convergence in distribution of the test statistic under the null hypothesis is proved, and the test's consistency follows. Moreover, properties under local alternatives are discussed. Applications are given for data of huge but finite dimension and for functional data in infinite-dimensional spaces. A general approach enables the treatment of incomplete data. In simulation studies, the test is shown to be competitive with alternative proposals.
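To make the type of statistic concrete: the classical univariate Cramér-von Mises statistic against a fully specified null (here Uniform(0,1)) can be computed in a few lines. This is only the textbook scalar case, not the Hilbert-space functional version developed in the paper:

```python
def cramer_von_mises_uniform(sample):
    """Cramér-von Mises statistic W² for H0: data ~ Uniform(0,1).
    W² = 1/(12n) + sum_i ((2i-1)/(2n) - F(x_(i)))², with F(x) = x here."""
    xs = sorted(sample)
    n = len(xs)
    w2 = 1.0 / (12 * n)
    for i, x in enumerate(xs, start=1):
        w2 += ((2 * i - 1) / (2 * n) - x) ** 2
    return w2

# A perfectly regular sample attains the minimum value 1/(12n):
sample = [(2 * i - 1) / 20 for i in range(1, 11)]   # midpoints of 10 bins
print(cramer_von_mises_uniform(sample))
```

Large values of W² indicate a discrepancy between the empirical and hypothesized distribution; the integral-type statistic in the abstract plays the same role for Hilbert-space-valued data.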
Companies often build their businesses on product information and therefore try to automate the process of information extraction (IE). Since the information sources are usually heterogeneous and non-standardized, classic extract-transform-load (ETL) techniques reach their limits. Hence, companies must implement the newest findings from research to tackle the challenges of process automation. They require a flexible and robust system that is extendable and ensures the optimal processing of different document types. This paper provides a distributed microservice architecture pattern that enables the automated generation of IE pipelines. Since the optimal pipeline design is individual to each input document, the system generates pipelines ad hoc at runtime, depending on specific document characteristics. Furthermore, it introduces automated quality determination for each available pipeline and controls the integration of new microservices based on their impact on the business value. The introduced system enables fast prototyping of the newest approaches from research and supports companies in automating their IE processes. Based on the automated quality determination, it ensures that the generated pipelines always meet defined business requirements when they go into productive use.
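The idea of assembling a pipeline at runtime from document characteristics can be sketched minimally. The step names, characteristics keys, and string-tagging behavior below are all hypothetical stand-ins for the paper's microservices:

```python
# Minimal sketch of ad-hoc pipeline generation: choose extraction steps
# at runtime based on document characteristics. All names are hypothetical.
from typing import Callable

Step = Callable[[str], str]

def ocr(doc: str) -> str:           return doc + "|ocr"
def layout(doc: str) -> str:        return doc + "|layout"
def table_extract(doc: str) -> str: return doc + "|tables"
def ner(doc: str) -> str:           return doc + "|entities"

def build_pipeline(characteristics: dict) -> list[Step]:
    """Assemble a pipeline from the document's characteristics."""
    steps: list[Step] = []
    if characteristics.get("scanned"):
        steps.append(ocr)          # only scanned documents need OCR
    steps.append(layout)
    if characteristics.get("has_tables"):
        steps.append(table_extract)
    steps.append(ner)
    return steps

def run(doc: str, characteristics: dict) -> str:
    for step in build_pipeline(characteristics):
        doc = step(doc)
    return doc

print(run("invoice.pdf", {"scanned": True, "has_tables": True}))
```

In the distributed setting described by the paper, each step would be a microservice call rather than a local function, and a quality score per generated pipeline would gate its productive use.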
A Formulation of Quantum Stochastic Processes and Some of its Properties. Hellwig, K.-E.; Stulpe, W.
(1983)
A generalized shear-lag theory for fibres with variable radius is developed to analyse elastic fibre/matrix stress transfer. The theory accounts for the reinforcement of biological composites, such as soft tissue and bone tissue, as well as for the reinforcement of technical composite materials, such as fibre-reinforced polymers (FRP). The original shear-lag theory proposed by Cox in 1952 is generalized for fibres with variable radius and with symmetric and asymmetric ends. Analytical solutions are derived for the distribution of axial and interfacial shear stress in cylindrical and elliptical fibres, as well as in conical and paraboloidal fibres with asymmetric ends. Additionally, the distributions of axial and interfacial shear stress for conical and paraboloidal fibres with symmetric ends are predicted numerically. The results are compared with solutions from axisymmetric finite element models. A parameter study is performed to investigate the suitability of alternative fibre geometries for use in FRP.
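For orientation, the classical Cox (1952) result that the paper generalizes can be stated compactly. The following is the standard textbook form for a straight cylindrical fibre of length l and radius r_f embedded in a matrix (a sketch from the shear-lag literature, not the generalized variable-radius theory derived in the paper):

```latex
\sigma_f(x) = E_f\,\varepsilon_m \left( 1 - \frac{\cosh(\beta x)}{\cosh(\beta l/2)} \right),
\qquad
\beta = \sqrt{\frac{2 G_m}{E_f\, r_f^2 \,\ln(R/r_f)}},
```

where E_f is the fibre modulus, G_m the matrix shear modulus, ε_m the applied matrix strain, and R the effective matrix radius around the fibre. The axial stress vanishes at the fibre ends and plateaus mid-fibre; the paper's contribution is to let r_f vary along x (conical, paraboloidal, elliptical profiles).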
On the basis of bivariate data, assumed to be observations of independent copies of a random vector (S,N), we consider testing the hypothesis that the distribution of (S,N) belongs to the parametric class of distributions that arise with the compound Poisson exponential model. Typically, this model is used in stochastic hydrology, with N as the number of rain days and S as the total rainfall amount during a certain time period, or in actuarial science, with N as the number of losses and S as the total loss expenditure during a certain time period. The compound Poisson exponential model is characterized by the fact that a specific transform associated with the distribution of (S,N) satisfies a certain differential equation. Mimicking the functional part of this equation by substituting the empirical counterparts of the transform, we obtain an expression whose squared, weighted integral is used as the test statistic. We deal with two variants of the latter, one of which is invariant under scale transformations of the S-part by fixed positive constants. Critical values are obtained by using a parametric bootstrap procedure. The asymptotic behavior of the tests is discussed. A simulation study demonstrates the performance of the tests in the finite-sample case. The procedure is applied to rainfall data and to an actuarial dataset. A multivariate extension is also discussed.
A high-Q resonance-mode measurement of EIS capacitive sensor by elimination of series resistance
(2017)
An EIS capacitive sensor is a semiconductor-based potentiometric sensor that is sensitive to the ion concentration or pH value of the solution in contact with its sensing surface. To detect a small change in ion concentration or pH, a small capacitance change must be detected. Recently, a resonance-mode measurement was proposed in which an inductor was connected to the EIS capacitive sensor and the resonant frequency was correlated with the pH value. In this study, the Q factor of the resonant circuit was enhanced by canceling the internal resistance of the reference electrode and the internal resistance of the inductor coil with the help of a bypass capacitor and a negative impedance converter, respectively. A 1% variation of the signal in the developed system corresponded to a pH change of 3.93 mpH, about 1/12 of that of the conventional method, suggesting better performance in the detection of small pH changes.
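The underlying relations are the series-RLC ones: the resonant frequency tracks the sensor capacitance, and the Q factor is limited by the total series resistance, which is why cancelling it sharpens the resonance. The component values below are illustrative only, not the ones used in the study:

```python
import math

# Illustrative component values; not taken from the paper.
L = 10e-3     # inductor, 10 mH (assumed)
C = 100e-12   # EIS sensor capacitance, 100 pF (assumed)

def resonant_frequency(L: float, C: float) -> float:
    """f0 = 1 / (2*pi*sqrt(L*C)) for an LC resonator."""
    return 1.0 / (2 * math.pi * math.sqrt(L * C))

def q_factor(L: float, C: float, R_series: float) -> float:
    """Q of a series RLC loop: Q = (1/R) * sqrt(L/C)."""
    return math.sqrt(L / C) / R_series

f0 = resonant_frequency(L, C)
print(f"f0 = {f0 / 1e3:.1f} kHz")
# Reducing the effective series resistance (electrode + coil) raises Q:
print("Q with 100 ohm series resistance:", q_factor(L, C, 100.0))
print("Q after cancelling all but 5 ohm:", q_factor(L, C, 5.0))
```

A higher Q means a narrower resonance peak, so the same capacitance (pH) change produces a larger relative signal change, consistent with the ~12-fold sensitivity gain reported above.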
The importance of validating and reproducing the outcome of computational processes is fundamental to many application domains. Assuring the provenance of workflows will likely become even more important with the incorporation of human tasks into standard workflows by emerging standards such as WS-HumanTask. This paper addresses this trend with an actor-based workflow approach that actively supports provenance. It proposes a framework to track and store provenance information automatically that applies to various workflow management systems. In particular, the introduced provenance framework supports the documentation of workflows in a legally binding way. The authors therefore use the concept of layered XML documents, i.e., history-tracing XML. Furthermore, the proposed provenance framework enables the executors (actors) of a particular workflow task to attest their operations and the associated results by integrating digital XML signatures.
Monitoring the cellular metabolism of bacteria in (bio)fermentation processes is crucial to control and steer them, and to prevent undesired disturbances linked to metabolically inactive microorganisms. In this context, cell-based biosensors can play an important role in improving the quality and increasing the yield of such processes. This work describes the simultaneous analysis of the metabolic behavior of three different types of bacteria by means of a differential light-addressable potentiometric sensor (LAPS) set-up. The study includes Lactobacillus brevis, Corynebacterium glutamicum, and Escherichia coli, which are often applied in fermentation processes in bioreactors. Differential measurements were carried out to compensate for undesirable influences such as sensor-signal drift and pH variation during the measurements. Furthermore, calibration curves of the cellular metabolism were established as a function of the glucose concentration or cell number for all three model microorganisms. In this context, simultaneous (bio)sensing with the multi-organism LAPS-based set-up can open new possibilities for cost-effective, rapid detection of the extracellular acidification of bacteria on a single sensor chip. It can be applied to evaluate the metabolic response of bacterial populations in a (bio)fermentation process, for instance, in biogas fermentation.
Results are presented on the ratios of the nucleon structure function in copper to deuterium from two separate experiments. The data confirm that the nucleon structure function, F2, is different for bound nucleons than for the quasi-free ones in the deuteron. The redistribution of the fraction of the nucleon's momentum carried by quarks is investigated, and the data are found to be compatible with no integral loss of quark momenta due to nuclear effects.
The spin asymmetry in deep inelastic scattering of longitudinally polarised muons by longitudinally polarised protons has been measured over a large x range (0.01<x<0.7). The spin-dependent structure function g1(x) for the proton has been determined and its integral over x found to be 0.114±0.012±0.026, in disagreement with the Ellis-Jaffe sum rule. Assuming the validity of the Bjorken sum rule, this result implies a significant negative value for the integral of g1 for the neutron. These values for the integrals of g1 lead to the conclusion that the total quark spin constitutes a rather small fraction of the spin of the nucleon.
A microscopic photometric method for measuring erythrocyte deformability. Artmann, Gerhard Michael
(1986)
Purpose
Two semi-empirical models were recently published, both making use of existing literature data but each taking into account different physical phenomena that trigger hemolysis. In the first model, hemoglobin (Hb) release is described as a permeation process across the membrane, assumed to be shear stress-dependent (sublethal model). The second model only accounts for hemoglobin release caused by cell membrane breakdown, which occurs when red blood cells (RBC) undergo mechanically induced shearing for longer than a threshold time (nonuniform threshold model). In this paper, we introduce a model that considers the hemolysis generated by both of these phenomena.
Methods
Since hemolysis can be caused both by permeation of hemoglobin through the functional RBC membrane and by release of hemoglobin upon RBC membrane breakdown, our proposed model combines both of these models. An experimental setup consisting of a Couette device was used to validate the proposed model.
Results
A comparison is presented between the damage index (DI) predicted by the proposed model, the sublethal model, and the nonuniform threshold model, as well as experimental datasets. This comparison covers a wide range of shear stresses for both human and porcine blood. Appropriate agreement between the measured DI and the DI predicted by the present model was obtained.
Conclusions
The semiempirical hemolysis model introduced in this paper aims for significantly enhanced conformity with experimental data. Two phenomenological outcomes become possible with the proposed approach: an estimation of the average time after which cell membrane breakdown occurs under the applied conditions, and a prediction of the ratio between the phenomena involved in hemolysis.
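The structure of such a combined model can be sketched generically: a shear- and time-dependent sublethal term plus a breakdown term that switches on past a threshold time. The power-law form and every constant below are hypothetical placeholders, not the fitted model of the paper:

```python
# Hypothetical sketch of combining two hemolysis mechanisms:
#  (1) sublethal, shear-dependent hemoglobin permeation (power-law term), and
#  (2) membrane breakdown once exposure exceeds a threshold time.
# All constants are illustrative, not the fitted values from the paper.

C, ALPHA, BETA = 3.6e-7, 0.78, 2.4   # assumed power-law coefficients
T_BREAKDOWN = 0.5                     # assumed threshold time [s]
DI_BREAKDOWN = 0.05                   # assumed breakdown contribution

def damage_index(tau: float, t: float) -> float:
    """Damage index DI(tau, t): permeation term plus breakdown term.
    tau: shear stress [Pa], t: exposure time [s]."""
    di = C * (t ** ALPHA) * (tau ** BETA)     # sublethal permeation
    if t > T_BREAKDOWN:                        # membrane breakdown
        di += DI_BREAKDOWN * (t - T_BREAKDOWN) / t
    return di

print(damage_index(tau=200.0, t=0.2))   # below threshold: permeation only
print(damage_index(tau=200.0, t=1.0))   # above threshold: both terms
```

The threshold parameter plays the role of the "average time after which cell membrane breakdown occurs" mentioned in the conclusions, and the relative size of the two terms corresponds to the predicted ratio between the phenomena.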
A New Class of Biosensors Based on Tobacco Mosaic Virus and Coat Proteins as Enzyme Nanocarrier
(2016)
A new in vitro tool to investigate cardiac contractility under physiological mechanical conditions
(2019)
Background
To elaborate the impact of new haemostatic agents, we developed an instrument for the pressure-controlled induction of blunt liver injuries in a porcine animal model.
Materials and Methods
A dilutional coagulopathy of 80% of the animal blood volume was induced in 9 anaesthetized pigs. Animals were randomly assigned to be injured with a force of 112 Newton (N) (n = 1), 224 ± 19 N (n = 4) or 355 ± 35 N (n = 4). The impact of injury was measured by blood loss, survival time and coagulation parameters. Liver histology was obtained to evaluate the degree of liver injury.
Results
The profound haemodilution resulted in a significant alteration of all coagulation parameters. After inflicting the injury with 355 ± 35 N, both the survival time (30 ± 9 min; p = 0.006) and blood loss (68 ± 16 ml min–1; p = 0.002) differed significantly from injuries with 224 ± 19 N (survival time: 76 ± 20 min, blood loss: 23 ± 4 ml min–1). In contrast, an injury with 112 N led to an insignificant blood loss of only 239 ml.
Conclusion
We developed a pressure-controlled clamp that allows the induction of highly reproducible blunt liver traumas, whose severity correlates with blood loss and survival time.