We present and discuss an exploration of the possibilities and properties of 3D printing with a printing space of 1 cubic meter, and how these can be integrated into architectural education through an experimental design and research course with students of architecture. We expand on issues presented at the eCAADe conference 2017 in Rome [Ref 6] by increasing the complexity and size of our prints, printing not a model to scale but a full-scale functional prototype of a usable architectural object: a coffee bar.
Training end users to interact with digital systems is indispensable for strong computer security. 'Competence Developing Game'-based approaches are particularly suitable for this purpose because of their motivation and simulation aspects. In this paper, the Competence Developing Game 'GHOST' for cybersecurity awareness training and its underlying patterns are described. Requirements for a 'Competence Developing Game'-based training are discussed, and it is shown how a game can fulfill these requirements. A supplementary game interaction design and a corresponding evaluation study are presented. The combination of training requirements and interaction design is used to create a 'Competence Developing Game'-based training concept. Part of this concept is implemented in a playable prototype that provides around one hour of play and training time. This prototype is used to evaluate the game and training aspects of the awareness training, demonstrating the quality of the game aspect and the effectiveness of the training aspect.
Recent Unmanned Aerial Vehicle (UAV) design procedures rely on full aircraft steady-state Reynolds-Averaged-Navier-Stokes (RANS) analyses in early design stages. Small sensor turrets are included in such simulations, even though their aerodynamic properties show highly unsteady behavior. Very little is known about the effects of this approach on the simulation outcomes of small turrets. Therefore, the flow around a model turret at a Reynolds number of 47,400 is simulated with a steady-state RANS approach and compared to experimental data. Lift, drag, and surface pressure show good agreement with the experiment. The RANS model predicts the separation location too far downstream and shows a larger recirculation region aft of the body. Both characteristic arch and horseshoe vortex structures are visualized and qualitatively match the ones found by the experiment. The Reynolds number dependence of the drag coefficient follows the trend of a sphere within a distinct range. The outcomes indicate that a steady-state RANS model of a small sensor turret is able to give results that are useful for UAV engineering purposes but might not be suited for detailed insight into flow properties.
Digital Image Correlation (DIC) is a powerful tool used to evaluate displacements and deformations in a non-intrusive manner. By comparing two images, one of the undeformed reference state of a specimen and another of the deformed target state, the relative displacement between those two states is determined. DIC is well known and often used for post-processing analysis of in-plane displacements and deformations of specimens. Increasing the analysis speed to enable real-time DIC analysis would be beneficial and would extend the field of use of this technique.
Here, we tested several combinations of the most common DIC methods together with different parallelization approaches in MATLAB and evaluated their performance to determine whether real-time analysis is possible with these methods. To reflect improvements in computing technology, different hardware settings were also analyzed. We found that implementation problems can reduce the efficiency of a theoretically superior algorithm such that it becomes practically slower than a suboptimal one. The Newton-Raphson algorithm combined with a modified Particle Swarm algorithm in parallel image computation was found to be most effective. This is contrary to theory, which suggests that the inverse-compositional Gauss-Newton algorithm is superior. As expected, the Brute Force Search algorithm is the least effective method. We also found that the correct choice of parallelization tasks is crucial to achieving improvements in computing speed; a poorly chosen parallelization approach with high parallel overhead leads to inferior performance. Finally, irrespective of the computing mode, the correct choice of combinations of integer-pixel and sub-pixel search algorithms is decisive for an efficient analysis. Using currently available hardware, real-time analysis at high frame rates remains an aspiration.
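To illustrate the integer-pixel stage of DIC discussed above, the following Python sketch performs a brute-force zero-normalized cross-correlation (ZNCC) search for the displacement of a small subset. This is a minimal illustration, not the paper's implementation (which is in MATLAB); the subset size, search radius, and test pattern are assumptions:

```python
import math

def zncc(ref, tgt):
    """Zero-normalized cross-correlation between two equal-size pixel subsets."""
    n = len(ref)
    mr = sum(ref) / n
    mt = sum(tgt) / n
    num = sum((r - mr) * (t - mt) for r, t in zip(ref, tgt))
    den = math.sqrt(sum((r - mr) ** 2 for r in ref) *
                    sum((t - mt) ** 2 for t in tgt))
    return num / den if den else 0.0

def subset(img, x, y, half):
    """Flatten the (2*half+1)^2 pixel subset centred at (x, y)."""
    return [img[j][i]
            for j in range(y - half, y + half + 1)
            for i in range(x - half, x + half + 1)]

def brute_force_match(ref_img, tgt_img, x, y, half, search):
    """Integer-pixel displacement of the subset at (x, y), found by
    exhaustively scoring every shift in the search window with ZNCC."""
    ref = subset(ref_img, x, y, half)
    best, best_c = (0, 0), -2.0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            c = zncc(ref, subset(tgt_img, x + dx, y + dy, half))
            if c > best_c:
                best_c, best = c, (dx, dy)
    return best
```

In practice this exhaustive search only seeds a sub-pixel refinement step (e.g. Newton-Raphson or inverse-compositional Gauss-Newton), which is where the algorithmic comparison in the study takes place.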
Tribological performance of biodegradable lubricants under different surface roughness of tools
(2019)
A light-addressable potentiometric sensor (LAPS) is a field-effect-based (bio-)chemical sensor, in which a desired sensing area on the sensor surface can be defined by illumination. Light addressability can be used to visualize the concentration and spatial distribution of the target molecules, e.g., H+ ions. This unique feature has great potential for the label-free imaging of the metabolic activity of living organisms. The cultivation of those organisms needs specially tailored surface properties of the sensor. O2 plasma treatment is an attractive and promising tool for rapid surface engineering. However, the potential impacts of the technique must be carefully investigated, as sensors can suffer from plasma-induced damage. Herein, a LAPS with a Ta2O5 pH-sensitive surface is successfully patterned by plasma treatment, and its effects are investigated by contact angle and scanning LAPS measurements. The plasma duration of 30 s (30 W) is found to be the threshold value, where excessive wettability begins. Furthermore, this treatment approach causes moderate plasma-induced damage, which can be reduced by thermal annealing (10 min at 300 °C). These findings provide a useful guideline to support future studies, where the LAPS surface is desired to be more hydrophilic by O2 plasma treatment.
Experience has shown that a priori created static resource allocation plans are vulnerable to runtime deviations and hence often become uneconomic or greatly exceed a predefined soft deadline. The assumption of constant task execution times during allocation planning is even less likely to hold in a cloud environment, where virtualized resources vary in performance. Revising the initially created resource allocation plan at runtime allows the scheduler to react to deviations between planning and execution. Such adaptive rescheduling of a many-task application workflow is only feasible when the planning time can be handled efficiently at runtime. In this paper, we present the static low-complexity resource allocation planning algorithm (LCP), applicable to efficiently scheduling many-task scientific application workflows on cloud resources of different capabilities. The benefits of the presented algorithm are benchmarked against alternative approaches. The benchmark results show that LCP not only competes with higher-complexity algorithms in terms of planned costs and planned makespan but also outperforms them significantly, by factors of 2 to 160, in terms of required planning time. Hence, LCP is superior in terms of practical usability where low planning time is essential, such as in our targeted online rescheduling scenario.
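The LCP algorithm itself is not detailed in the abstract. As a hypothetical sketch of the general idea of low-complexity list scheduling on heterogeneous cloud resources, a greedy earliest-finish-time heuristic for independent tasks could look as follows; the function name, cost model, and numbers are illustrative assumptions, not the paper's method:

```python
def greedy_eft_schedule(task_costs, resource_speeds):
    """Assign each task to the resource on which it finishes earliest.
    task_costs: work units per task; resource_speeds: units per hour."""
    ready = [0.0] * len(resource_speeds)  # time each resource becomes free
    assignment = []
    # Scheduling longest tasks first tends to reduce the greedy plan's makespan.
    for tid, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
        finish = [ready[r] + cost / s for r, s in enumerate(resource_speeds)]
        r_best = min(range(len(finish)), key=finish.__getitem__)
        ready[r_best] = finish[r_best]
        assignment.append((tid, r_best))
    return assignment, max(ready)

# Example: six tasks on two resources, the second twice as fast.
plan, makespan = greedy_eft_schedule([4, 2, 6, 3, 5, 1], [1.0, 2.0])
```

Each planning step is a single pass over the resources, which is the kind of low per-decision cost that makes runtime rescheduling affordable; real workflow schedulers must additionally respect task dependencies and monetary cost.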
Thermal and Optical Study on the Frequency Dependence of an Atmospheric Microwave Argon Plasma Jet
(2019)
Production and Characterization of Porous Fibroin Scaffolds for Regenerative Medical Application
(2019)
Kyphoplasty of Osteoporotic Fractured Vertebrae: A Finite Element Analysis about Two Types of Cement
(2019)
In this paper, the results of a techno-economic analysis of improved and optimized molten salt solar tower plants (MSSTP plants) are presented. The potential improvements that were analyzed include different receiver designs, different designs of the high temperature fluid system (HTF-system) and plant control, increased molten salt temperatures (up to 640°C), and multi-tower systems. Detailed technological and economic models of the solar field, solar receiver, and HTF-system were developed and used to find potential improvements compared to a reference plant based on Solar Two technology and up-to-date cost estimations. The annual yield model calculates the annual outputs and the LCOE of all variants. An improved external tubular receiver together with an improved HTF-system achieves a significant decrease in LCOE compared to the reference. This is caused by lower receiver cost as well as improvements of the HTF-system and plant operation strategy, which significantly reduce the plant's own consumption. A novel star receiver shows potential for further cost reduction. The cavity receiver concepts result in higher LCOE due to their high investment cost, despite achieving higher efficiencies. Increased molten salt temperatures appear feasible with an adapted, closed-loop HTF-system and achieve results comparable to the original improved system (with 565°C) under the given boundary conditions. In this analysis, all multi-tower systems show lower economic viability than single-tower systems, caused by the high additional cost of piping connections and the higher cost of the receivers.
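The variants above are compared by LCOE. As a minimal sketch, the standard simplified LCOE formula annuitizes the investment with a capital recovery factor and divides total yearly cost by the yearly electricity yield; the discount rate, lifetime, and example numbers below are illustrative assumptions, not values from the study:

```python
def crf(rate, years):
    """Capital recovery factor: annuitizes an up-front investment over its lifetime."""
    q = (1.0 + rate) ** years
    return rate * q / (q - 1.0)

def lcoe(capex, opex_per_year, annual_yield_mwh, rate=0.07, years=25):
    """Simplified levelized cost of electricity (currency units per MWh):
    annuitized investment plus yearly O&M, divided by the annual yield."""
    return (capex * crf(rate, years) + opex_per_year) / annual_yield_mwh

# Illustrative numbers only: 1 bn capex, 20 M/yr O&M,
# 500 GWh/yr yield, 7% discount rate, 25-year lifetime.
cost = lcoe(1.0e9, 2.0e7, 5.0e5)
```

This makes the trade-off in the abstract concrete: a receiver that raises efficiency (larger annual yield) can still raise LCOE if its investment cost grows faster than the yield.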
We propose the chance constrained programming model of stochastic programming theory to analyze limit and shakedown loads of structures under random strength with a lognormal distribution. A dual chance constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit and the shakedown limit. The edge-based smoothed finite element method (ES-FEM) is used with three-node linear triangular elements.
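A minimal sketch of the deterministic equivalent behind such a chance constraint: for a lognormal strength R, requiring P(R >= r) >= alpha reduces to r <= R_alpha, the (1-alpha)-quantile of R. The parametrization by median strength and coefficient of variation, and the example numbers, are illustrative assumptions, not taken from the paper:

```python
import math
from statistics import NormalDist

def lognormal_strength_quantile(median_R, cov_R, alpha):
    """Deterministic-equivalent strength R_alpha such that
    P(R >= R_alpha) >= alpha for a lognormal random strength R."""
    # Lognormal parameters from the median and the coefficient of variation.
    sigma_ln = math.sqrt(math.log(1.0 + cov_R ** 2))
    mu_ln = math.log(median_R)
    z = NormalDist().inv_cdf(1.0 - alpha)  # standard normal quantile
    return math.exp(mu_ln + sigma_ln * z)

# Example: median strength 250 MPa, 10% CoV, 95% reliability target.
R_95 = lognormal_strength_quantile(250.0, 0.10, 0.95)
```

The load bounds can then be computed with this reduced strength in an otherwise deterministic limit or shakedown optimization, which is what makes the chance constrained formulation tractable.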
The molecular events during nongenotoxic carcinogenesis and their temporal order are poorly understood but thought to include long-lasting perturbations of gene expression. Here, we have investigated the temporal sequence of molecular and pathological perturbations at early stages of phenobarbital (PB) mediated liver tumor promotion in vivo. Molecular profiling (mRNA, microRNA [miRNA], DNA methylation, and proteins) of mouse liver during 13 weeks of PB treatment revealed progressive increases in hepatic expression of long noncoding RNAs and miRNAs originating from the Dlk1-Dio3 imprinted gene cluster, a locus that has recently been associated with stem cell pluripotency in mice and various neoplasms in humans. PB induction of the Dlk1-Dio3 cluster noncoding RNA (ncRNA) Meg3 was localized to glutamine synthetase-positive hypertrophic perivenous hepatocytes, suggesting a role for β-catenin signaling in the dysregulation of Dlk1-Dio3 ncRNAs. The carcinogenic relevance of Dlk1-Dio3 locus ncRNA induction was further supported by in vivo genetic dependence on constitutive androstane receptor and β-catenin pathways. Our data identify Dlk1-Dio3 ncRNAs as novel candidate early biomarkers for mouse liver tumor promotion and provide new opportunities for assessing the carcinogenic potential of novel compounds.
1. Drug metabolizing enzymes and transporters play important roles in the absorption, metabolism, tissue distribution and excretion of various compounds and their metabolites and thus can significantly affect their efficacy and safety. Furthermore, they can be involved in drug–drug interactions which can result in adverse responses, life-threatening toxicity or impaired efficacy. Significant species differences in the interaction of compounds with drug metabolizing enzymes and transporters have been described.
2. In order to overcome the limitation of animal models in accurately predicting human responses, a large variety of mouse models humanized for drug metabolizing enzymes and to a lesser extent drug transporters have been created.
3. This review summarizes the literature describing these mouse models and their key applications in studying the role of drug metabolizing enzymes and transporters in drug bioavailability, tissue distribution, clearance and drug–drug interactions as well as in human metabolite testing and risk assessment.
4. Though such humanized mouse models have certain limitations, there is great potential for their use in basic research and for testing and development of new medicines. These limitations and future potentials will be discussed.
Mice that have been genetically humanized for proteins involved in drug metabolism and toxicity and mice engrafted with human hepatocytes are emerging and promising in vivo models for an improved prediction of the pharmacokinetic, drug–drug interaction and safety characteristics of compounds in humans. The specific advantages and disadvantages of these models should be carefully considered when using them for studies in drug discovery and development. Here, an overview on the corresponding genetically humanized and chimeric liver humanized mouse models described to date is provided and illustrated with examples of their utility in drug metabolism and toxicity studies. We compare the strength and weaknesses of the two different approaches, give guidance for the selection of the appropriate model for various applications and discuss future trends and perspectives.