Purpose — to compare the chemical elemental composition of the vitreous cavity content of cadaveric eyes with that of samples taken from eyes with terminal-stage refractory glaucoma and decompensated intraocular pressure (IOP).
Material and methods. The vitreous contents of eyes from two groups were studied. The first group included 15 cadaveric eyes; the second group included 15 eyes with terminal-stage refractory glaucoma and decompensated IOP in patients with hypertensive pain syndrome. The vitreal samples were taken in the course of antiglaucoma surgery aimed at preserving the eye as an organ and involving placement of a drainage in the vitreous cavity. The vitreal contents were analyzed on an Oxford X-Max 50 energy-dispersive spectrometer integrated into a Zeiss EVO LS10 scanning electron microscope.
Results. Increased concentrations of potassium and phosphorus were detected in the vitreous content of cadaveric eyes compared with the vitreal content taken in vivo from eyes with terminal glaucoma and decompensated IOP (K — 0.172 vs. 0.093; P — 0.045 vs. 0.025 mmol/L). In the vitreous cavity of eyes with end-stage glaucoma and decompensated IOP, the nitrogen concentration was higher than in human cadaver eyes (2.030 vs. 1.424 mmol/L).
Conclusion. The increased concentrations of potassium and phosphorus in the vitreous content of cadaveric eyes are associated with postmortem autolytic processes and the release of intracellular content upon destruction of cell membranes. The increased nitrogen concentration in the vitreal contents of eyes with terminal-stage glaucoma and decompensated IOP may be associated with the presence of osmotically active nitrogen-containing compounds in eyes with increased IOP.
The Carologistics team participates in the RoboCup Logistics League (RCLL) for the seventh year. The RCLL requires precise vision, manipulation, and path planning, as well as complex high-level decision making and multi-robot coordination. We outline our approach with an emphasis on recent modifications to those components.
The team members in 2018 are David Bosen, Christoph Gollok, Mostafa Gomaa, Daniel Habering, Till Hofmann, Nicolas Limpert, Sebastian Schönitz, Morian Sonnet, Carsten Stoffels, and Tarik Viehmann.
This paper is based on last year's team description.
Extremely High Lightning Currents
(2018)
Lightning remains an enormous source of damage, causing personal injury, fires, mechanical destruction, and in particular overvoltages. This is confirmed not least by current statistics from property insurers. There are recurring reports of extremely high lightning currents, which can of course lead to severe damage and destruction, with peak values sometimes quoted well above 300 kA. This raises questions, since the "classical" lightning statistics (e.g., according to CIGRE and IEC [8][10]) have not recorded such values so far. These extreme lightning currents are usually derived from the data of lightning location systems.
We present and discuss an exploration of the possibilities and properties of 3D printing with a printing space of 1 cubic meter, and how those can be integrated into architectural education through an experimental design and research course with students of architecture. We expand on issues presented at the eCAADe conference 2017 in Rome [Ref 6] by increasing the complexity and size of our prints, printing not a model to scale but a full-scale functional prototype of a usable architectural object: a coffee bar.
Training end users to interact with digital systems is indispensable to ensure strong computer security. 'Competence Developing Game'-based approaches are particularly suitable for this purpose because of their motivational and simulation aspects. In this paper, the Competence Developing Game 'GHOST' for cybersecurity awareness training and its underlying patterns are described. Accordingly, requirements for a 'Competence Developing Game'-based training are discussed, and it is shown how a game can fulfill these requirements. A supplementary game interaction design and a corresponding evaluation study are presented. The combination of training requirements and interaction design is used to create a 'Competence Developing Game'-based training concept. Part of this concept is implemented as a playable prototype that provides around one hour of play and training time. This prototype is used to evaluate both the game and the training aspects of the awareness training, demonstrating the quality of the game aspect and the effectiveness of the training aspect.
Recent Unmanned Aerial Vehicle (UAV) design procedures rely on full-aircraft steady-state Reynolds-Averaged Navier-Stokes (RANS) analyses in early design stages. Small sensor turrets are included in such simulations, even though their aerodynamic properties show highly unsteady behavior. Very little is known about the effects of this approach on the simulation outcomes for small turrets. Therefore, the flow around a model turret at a Reynolds number of 47,400 is simulated with a steady-state RANS approach and compared to experimental data. Lift, drag, and surface pressure show good agreement with the experiment. The RANS model predicts the separation location too far downstream and shows a larger recirculation region aft of the body. Both characteristic arch and horseshoe vortex structures are visualized and qualitatively match those found in the experiment. The Reynolds number dependence of the drag coefficient follows the trend of a sphere within a distinct range. The outcomes indicate that a steady-state RANS model of a small sensor turret can give results that are useful for UAV engineering purposes but might not be suited for detailed insight into flow properties.
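As a rough illustration of the bulk quantities compared above, the sketch below computes the Reynolds number and drag coefficient from free-stream conditions. All numerical values are assumptions chosen only to land in the quoted Re ≈ 47,400 regime; they are not taken from the study.

```python
# Illustrative only: bulk dimensionless quantities used to compare a
# turret simulation against experiments. Values are assumptions, not
# the paper's actual model parameters.
def reynolds_number(velocity, length, kinematic_viscosity):
    """Re = U * L / nu, based on free-stream velocity and a characteristic length."""
    return velocity * length / kinematic_viscosity

def drag_coefficient(drag_force, density, velocity, frontal_area):
    """C_d = 2 F_d / (rho * U^2 * A)."""
    return 2.0 * drag_force / (density * velocity**2 * frontal_area)

# Example: air at 20 deg C (nu ~ 1.5e-5 m^2/s) around a hypothetical
# 25 mm turret gives the Reynolds number regime quoted in the abstract.
U = 28.4     # m/s, free-stream velocity (assumed)
D = 0.025    # m, characteristic turret diameter (assumed)
nu = 1.5e-5  # m^2/s, kinematic viscosity of air
print(f"Re = {reynolds_number(U, D, nu):.0f}")  # ~47,000
```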
Digital Image Correlation (DIC) is a powerful tool used to evaluate displacements and deformations in a non-intrusive manner. By comparing two images, one of the undeformed reference state of a specimen and another of the deformed target state, the relative displacement between those two states is determined. DIC is well known and often used for post-processing analysis of in-plane displacements and deformations of specimens. Increasing the analysis speed to enable real-time DIC analysis would be beneficial and extend the field of use of this technique.
Here we tested several combinations of the most common DIC methods with different parallelization approaches in MATLAB and evaluated their performance to determine whether real-time analysis is possible with these methods. To reflect improvements in computing technology, different hardware settings were also analysed. We found that implementation problems can reduce the efficiency of a theoretically superior algorithm such that it becomes practically slower than a suboptimal algorithm. The Newton-Raphson algorithm in combination with a modified Particle Swarm algorithm in parallel image computation was found to be most effective. This is contrary to theory, which suggests that the inverse-compositional Gauss-Newton algorithm is superior. As expected, the Brute Force Search algorithm is the least effective method. We also found that the correct choice of parallelization tasks is crucial to achieve improvements in computing speed: a poorly chosen parallelization approach with high parallel overhead leads to inferior performance. Finally, irrespective of the computing mode, the correct choice of combinations of integer-pixel and sub-pixel search algorithms is decisive for an efficient analysis. Using currently available hardware, real-time analysis at high frame rates remains an aspiration.
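For readers unfamiliar with the methods benchmarked above, the following minimal Python sketch shows a brute-force integer-pixel DIC search using zero-normalized cross-correlation (ZNCC). It corresponds in spirit to the Brute Force Search baseline only; the authors' MATLAB implementations, the Newton-Raphson and inverse-compositional Gauss-Newton sub-pixel refinements, and the parallelization layers are not reproduced here.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized subsets."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def integer_pixel_search(ref, tgt, center, half_subset=10, search_radius=15):
    """Brute-force integer-pixel DIC: slide the reference subset over the
    target image and keep the displacement with the highest ZNCC score.
    ref, tgt: 2D numpy arrays (grayscale images); center: (row, col)."""
    cy, cx = center
    subset = ref[cy - half_subset:cy + half_subset + 1,
                 cx - half_subset:cx + half_subset + 1]
    best = (-1.0, (0, 0))
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            y, x = cy + dy, cx + dx
            cand = tgt[y - half_subset:y + half_subset + 1,
                       x - half_subset:x + half_subset + 1]
            if cand.shape != subset.shape:
                continue  # candidate subset leaves the image, skip it
            score = zncc(subset, cand)
            if score > best[0]:
                best = (score, (dy, dx))
    return best  # (correlation, (dy, dx)) integer displacement
```

A sub-pixel stage (e.g., Newton-Raphson on the correlation surface) would normally refine the integer result returned here, which is where the performance differences discussed above arise.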
Tribological performance of biodegradable lubricants under different surface roughness of tools
(2019)
Experience has shown that a priori created static resource allocation plans are vulnerable to runtime deviations and hence often become uneconomic or greatly exceed a predefined soft deadline. The assumption of constant task execution times during allocation planning is even less tenable in a cloud environment, where virtualized resources vary in performance. Revising the initially created resource allocation plan at runtime allows the scheduler to react to deviations between planning and execution. Such adaptive rescheduling of a many-task application workflow is only feasible when the planning time can be handled efficiently at runtime. In this paper, we present the static low-complexity resource allocation planning algorithm (LCP), applicable to efficiently schedule many-task scientific application workflows on cloud resources of different capabilities. The benefits of the presented algorithm are benchmarked against alternative approaches. The benchmark results show that LCP is not only able to compete with higher-complexity algorithms in terms of planned costs and planned makespan, but also outperforms them significantly, by factors of 2 to 160, in terms of required planning time. Hence, LCP is superior in terms of practical usability where low planning time is essential, such as in our targeted online rescheduling scenario.
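The abstract does not spell out the LCP algorithm itself, so the sketch below only illustrates the general flavor of low-complexity static planning: a greedy earliest-finish-time assignment of tasks to heterogeneous resources. It is a hypothetical stand-in, not the authors' method.

```python
# A minimal sketch of greedy earliest-finish-time allocation over
# heterogeneous resources. This is NOT the paper's LCP algorithm (its
# details are not given in the abstract); it only illustrates why a
# low-complexity static plan is cheap enough to recompute at runtime.
def greedy_allocate(tasks, resources):
    """tasks: list of (task_id, workload); resources: dict id -> speed.
    Returns a task -> resource mapping plus the planned makespan."""
    finish = {r: 0.0 for r in resources}  # earliest free time per resource
    plan = {}
    for task_id, workload in sorted(tasks, key=lambda t: -t[1]):  # largest first
        # pick the resource on which this task would finish earliest
        best = min(resources, key=lambda r: finish[r] + workload / resources[r])
        finish[best] += workload / resources[best]
        plan[task_id] = best
    return plan, max(finish.values())

plan, makespan = greedy_allocate(
    tasks=[("t1", 8.0), ("t2", 4.0), ("t3", 2.0)],
    resources={"vm-small": 1.0, "vm-large": 2.0},
)
print(plan, makespan)  # t1 -> vm-large, t2 -> vm-small, t3 -> vm-large; makespan 5.0
```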
Thermal and Optical Study on the Frequency Dependence of an Atmospheric Microwave Argon Plasma Jet
(2019)
Production and Characterization of Porous Fibroin Scaffolds for Regenerative Medical Application
(2019)
Kyphoplasty of Osteoporotic Fractured Vertebrae: A Finite Element Analysis about Two Types of Cement
(2019)
In this paper, the results of a techno-economic analysis of improved and optimized molten salt solar tower plants (MSSTP plants) are presented. The potential improvements that were analyzed include different receiver designs, different designs of the high temperature fluid system (HTF-system) and plant control, increased molten salt temperatures (up to 640°C), and multi-tower systems. Detailed technological and economic models of the solar field, solar receiver, and HTF-system were developed and used to find potential improvements compared to a reference plant based on Solar Two technology and up-to-date cost estimations. The annual yield model calculates the annual outputs and the levelized cost of electricity (LCOE) of all variants. An improved external tubular receiver and an improved HTF-system achieve a significant decrease in LCOE compared to the reference. This is caused by lower receiver cost as well as improvements of the HTF-system and plant operation strategy, which significantly reduce the plant's own consumption. A novel star receiver shows potential for further cost decrease. The cavity receiver concepts result in higher LCOE due to their high investment cost, despite achieving higher efficiencies. Increased molten salt temperatures seem possible with an adapted, closed-loop HTF-system and achieve results comparable to the original improved system (with 565°C) under the given boundary conditions. In this analysis, all multi-tower systems show lower economic viability than single-tower systems, caused by the high additional cost of piping connections and the higher cost of the receivers.
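The LCOE comparisons above rest on the standard levelization idea: annualize the investment with a capital recovery factor and divide total annual cost by annual yield. The simplified sketch below shows that calculation; all numbers are placeholders, not values from the paper's detailed models.

```python
# A simplified LCOE sketch. The paper's technological and economic
# models are far more detailed; every number below is a placeholder.
def annuity_factor(discount_rate, lifetime_years):
    """Capital recovery factor for annualizing investment cost."""
    q = discount_rate
    return q * (1 + q) ** lifetime_years / ((1 + q) ** lifetime_years - 1)

def lcoe(capex, opex_per_year, annual_yield_mwh, discount_rate=0.07, lifetime=25):
    """Levelized cost of electricity in currency units per MWh."""
    annualized_capex = capex * annuity_factor(discount_rate, lifetime)
    return (annualized_capex + opex_per_year) / annual_yield_mwh

# Hypothetical 100 MW-class molten salt tower plant (assumed figures)
print(f"{lcoe(capex=600e6, opex_per_year=12e6, annual_yield_mwh=480_000):.1f} per MWh")
```

This structure also explains the trade-off reported above: a cavity receiver can raise the annual yield (denominator) and still lose on LCOE if its investment cost (numerator) grows faster.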
We apply the chance constrained programming model of stochastic programming theory to analyze limit and shakedown loads of structures whose strength is random with a lognormal distribution. A dual chance constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit and the shakedown limit. The edge-based smoothed finite element method (ES-FEM) is used with three-node linear triangular elements.
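As background on the chance constrained formulation, the sketch below shows how a reliability requirement on a lognormally distributed strength reduces to a deterministic quantile. It illustrates the probabilistic transformation only; the paper's dual algorithm and the ES-FEM discretization are not reproduced.

```python
import numpy as np
from scipy.stats import norm

# Sketch: deterministic equivalent of a chance constraint on lognormal
# strength. If the strength R is lognormal and we require
#   P(R >= r_design) >= p,
# then r_design is the (1-p)-quantile of R:
#   r_design = exp(mu_ln + Phi^{-1}(1 - p) * sigma_ln).
def design_strength(mean, cov, p):
    """(1-p)-quantile of a lognormal strength with given mean and
    coefficient of variation cov = std/mean."""
    sigma_ln = np.sqrt(np.log(1.0 + cov**2))
    mu_ln = np.log(mean) - 0.5 * sigma_ln**2
    return np.exp(mu_ln + norm.ppf(1.0 - p) * sigma_ln)

# Example (assumed values): mean strength 250 MPa, 10% scatter,
# 95% reliability target -> a design strength below the mean.
print(f"{design_strength(250.0, 0.10, 0.95):.1f} MPa")  # ~211 MPa
```

Replacing the random strength by this quantile turns the chance constraint into an ordinary deterministic bound, which is what makes the limit and shakedown optimization problems tractable.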
The molecular events during nongenotoxic carcinogenesis and their temporal order are poorly understood but thought to include long-lasting perturbations of gene expression. Here, we have investigated the temporal sequence of molecular and pathological perturbations at early stages of phenobarbital (PB) mediated liver tumor promotion in vivo. Molecular profiling (mRNA, microRNA [miRNA], DNA methylation, and proteins) of mouse liver during 13 weeks of PB treatment revealed progressive increases in hepatic expression of long noncoding RNAs and miRNAs originating from the Dlk1-Dio3 imprinted gene cluster, a locus that has recently been associated with stem cell pluripotency in mice and various neoplasms in humans. PB induction of the Dlk1-Dio3 cluster noncoding RNA (ncRNA) Meg3 was localized to glutamine synthetase-positive hypertrophic perivenous hepatocytes, suggesting a role for β-catenin signaling in the dysregulation of Dlk1-Dio3 ncRNAs. The carcinogenic relevance of Dlk1-Dio3 locus ncRNA induction was further supported by in vivo genetic dependence on constitutive androstane receptor and β-catenin pathways. Our data identify Dlk1-Dio3 ncRNAs as novel candidate early biomarkers for mouse liver tumor promotion and provide new opportunities for assessing the carcinogenic potential of novel compounds.
1. Drug metabolizing enzymes and transporters play important roles in the absorption, metabolism, tissue distribution and excretion of various compounds and their metabolites and thus can significantly affect their efficacy and safety. Furthermore, they can be involved in drug–drug interactions which can result in adverse responses, life-threatening toxicity or impaired efficacy. Significant species differences in the interaction of compounds with drug metabolizing enzymes and transporters have been described.
2. In order to overcome the limitation of animal models in accurately predicting human responses, a large variety of mouse models humanized for drug metabolizing enzymes and to a lesser extent drug transporters have been created.
3. This review summarizes the literature describing these mouse models and their key applications in studying the role of drug metabolizing enzymes and transporters in drug bioavailability, tissue distribution, clearance and drug–drug interactions as well as in human metabolite testing and risk assessment.
4. Though such humanized mouse models have certain limitations, there is great potential for their use in basic research and for testing and development of new medicines. These limitations and future potentials will be discussed.
Mice that have been genetically humanized for proteins involved in drug metabolism and toxicity, and mice engrafted with human hepatocytes, are emerging and promising in vivo models for an improved prediction of the pharmacokinetic, drug–drug interaction, and safety characteristics of compounds in humans. The specific advantages and disadvantages of these models should be carefully considered when using them for studies in drug discovery and development. Here, an overview of the corresponding genetically humanized and chimeric liver humanized mouse models described to date is provided and illustrated with examples of their utility in drug metabolism and toxicity studies. We compare the strengths and weaknesses of the two different approaches, give guidance for the selection of the appropriate model for various applications, and discuss future trends and perspectives.