Background/Aims: Common systems for the quantification of cellular contraction rely on animal-based models, complex experimental setups or indirect approaches. The CellDrum technology presented here, which tests the mechanical tension of cellular monolayers and thin tissue constructs, has the potential to scale up mechanical testing towards medium-throughput analyses. Using hiPS cardiac myocytes (hiPS-CMs), it offers a new perspective on drug testing and brings us closer to personalized drug medication. Methods: In the present study, monolayers of self-beating hiPS-CMs were grown on ultra-thin circular silicone membranes, which deflect under the weight of the culture medium. Rhythmic contractions of the hiPS-CMs induced variations in the membrane deflection. The recorded contraction-relaxation cycles were analyzed with respect to their amplitudes, durations, time integrals and frequencies. Besides unstimulated force and tensile stress, we investigated the effects of agonists and antagonists acting on Ca²⁺ channels (S-Bay K8644/verapamil) and Na⁺ channels (veratridine/lidocaine). Results: The measured data and simulations for pharmacologically unstimulated contraction resembled findings in native human heart tissue, while the pharmacological dose-response curves were highly accurate and consistent with reference data. Conclusion: We conclude that the combination of the CellDrum with hiPS-CMs offers a fast, facile and precise system for pharmacological and toxicological studies and opens new potential for preclinical basic research.
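The cycle metrics named above (amplitude, duration, time integral, frequency) can be extracted from a deflection trace with standard peak detection. A minimal sketch on a synthetic beating signal, assuming nothing about the actual CellDrum data format:

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic deflection trace: 1 Hz "beating", 10 s sampled at 100 Hz.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
trace = 0.5 * (1 - np.cos(2 * np.pi * 1.0 * t)) ** 4  # sharp periodic beats

# Detect contraction peaks, enforcing a minimum spacing of 0.5 s.
peaks, _ = find_peaks(trace, height=0.5 * trace.max(), distance=int(0.5 * fs))

periods = np.diff(peaks) / fs            # seconds between successive peaks
frequency = 1.0 / periods.mean()         # beating frequency in Hz
amplitude = trace[peaks].mean()          # mean peak amplitude
integral = np.trapz(trace, dx=1 / fs)    # time integral of the whole trace
```

On real recordings the same pipeline applies after baseline removal; only the peak-height and spacing thresholds need tuning.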
Autoradiography is a well-established method of nuclear imaging. When different radionuclides are present simultaneously, additional processing is needed to distinguish their distributions. In this work, a method is presented in which aluminium absorbers of different thicknesses are used to produce images with different cut-off energies. By subtracting images pixel by pixel, one can generate images representing certain ranges of β-particle energies. The method is applied to the measurement of irradiated reactor graphite samples containing several radionuclides to determine the spatial distribution of these radionuclides within pre-defined energy windows. The process was repeated under fixed parameters after thermal treatment of the samples. The greyscale images of the distribution after treatment were subtracted from the corresponding pre-treatment images. Significant changes in the intensity and distribution of radionuclides could be observed in some samples. Owing to the thermal treatment parameters, the most significant differences were observed in the ³H and ¹⁴C inventories and distributions.
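The pixel-by-pixel subtraction step can be sketched with hypothetical image arrays; the counts, absorber set-up and cut-off energies E1 < E2 below are illustrative values, not measured data:

```python
import numpy as np

# Hypothetical greyscale autoradiographs (counts per pixel) recorded through
# aluminium absorbers of increasing thickness: a thicker absorber stops
# low-energy beta particles, so each image integrates energies above a cut-off.
img_no_absorber = np.array([[120, 80], [60, 200]], dtype=float)  # all energies
img_thin        = np.array([[ 90, 50], [40, 150]], dtype=float)  # E > E1
img_thick       = np.array([[ 30, 20], [10,  60]], dtype=float)  # E > E2 > E1

# Pixel-by-pixel subtraction isolates energy windows; clipping guards
# against negative counts caused by noise.
window_low = np.clip(img_no_absorber - img_thin, 0, None)  # energies below E1
window_mid = np.clip(img_thin - img_thick, 0, None)        # window E1..E2
```

The same subtraction, applied to pre- and post-treatment images under fixed acquisition parameters, yields the difference maps discussed above.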
In the last decades, several hundred exoplanets could be detected thanks to space-based observatories, namely CNES’ COROT and NASA’s Kepler. To expand this quest, ESA plans to launch CHEOPS as the first small-class mission (S1) in the Cosmic Vision programme and PLATO as the third medium-class mission (M3). PLATO’s primary objective is the detection of Earth-like exoplanets orbiting solar-type stars in the habitable zone and the characterisation of their bulk properties. This is made possible by precise light-curve measurements via 34 cameras. Accurate pointing is therefore key to achieving the required signal-to-noise ratio for positive transit detection. The paper starts with a comprehensive overview of PLATO’s mission objectives and mission architecture. Thereafter, special focus is devoted to PLATO’s pointing requirements. Understanding the very nature of these requirements is essential to derive a design baseline that achieves the required performance. The PLATO frequency domain of interest ranges from 40 mHz to 3 Hz. Due to the very different time scales involved, the spectral pointing requirement is decomposed into a high-frequency part dominated by the attitude control system and a low-frequency part dominated by the thermo-elastic properties of the spacecraft configuration. Both pose stringent constraints on the overall design as well as on technology properties to comply with the derived requirements and thus assure a successful mission.
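The band-wise decomposition of a spectral pointing requirement rests on the standard relation that the RMS pointing error contributed by a frequency band is the square root of the pointing-error power spectral density integrated over that band. A minimal sketch with an assumed flat PSD level (illustrative only, not a PLATO figure):

```python
import numpy as np

# Frequency grid over the band of interest, 40 mHz to 3 Hz.
f = np.linspace(0.04, 3.0, 1000)   # Hz

# Assumed flat pointing-error PSD level, purely for illustration.
psd = np.full_like(f, 1e-4)        # arcsec^2 / Hz

# RMS contribution of the band: sqrt of the integrated PSD.
rms = np.sqrt(np.trapz(psd, f))    # arcsec
```

Splitting the integral at a crossover frequency gives the low- and high-frequency contributions, whose squares add up to the total band variance.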
Efficiency concepts for comparing statistical tests based on different statistical experiments are introduced. These comprise the asymptotic relative efficiencies already known from the comparison of statistical tests within one and the same model, such as the Hodges-Lehmann efficiency, the Bahadur efficiency and the Pitman efficiency, as well as criteria based on the volumes of confidence regions. Efficiency statements are obtained, among others, for likelihood-ratio tests and Wald tests within a general multivariate parametric model. Statistical tests for hypotheses about the relative efficiency of two experiments are proposed. On the basis of the results obtained, the efficiency of corresponding procedures under paired sampling and under independent sampling is compared. The role of the covariance matrix under paired sampling is elaborated, in particular under the assumption that the underlying distributions can be modelled by k-parameter exponential families. Connections to efficiency concepts for point and confidence-region estimation procedures are shown. More detailed investigations concern the corresponding Hotelling T² tests in the multivariate normal case, the classical homogeneity tests for k × k contingency tables and the Wilcoxon tests in nonparametric location-shift models.
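The Pitman efficiency referred to here is, in its standard textbook formulation, the limiting ratio of the sample sizes two tests require to attain the same level α and power β against a sequence of local alternatives θₙ:

```latex
% Pitman asymptotic relative efficiency of test T_1 with respect to T_2:
% n_i(\alpha,\beta,\theta_n) is the sample size test T_i needs to reach
% power \beta at level \alpha against the local alternatives \theta_n.
\operatorname{ARE}_{P}(T_1, T_2)
  \;=\; \lim_{n \to \infty}
        \frac{n_2(\alpha,\beta,\theta_n)}{n_1(\alpha,\beta,\theta_n)}
```

A classical instance relevant to the Wilcoxon tests mentioned above: under normality, the Pitman efficiency of the Wilcoxon test relative to the t-test is 3/π ≈ 0.955.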
Combined with the use of renewable energy sources for its production, hydrogen represents a possible alternative gas turbine fuel for future low-emission power generation. Due to the difference in the physical properties of hydrogen compared to other fuels such as natural gas, well-established gas turbine combustion systems cannot be directly applied to dry low NOₓ (DLN) hydrogen combustion. The DLN micromix combustion of hydrogen has been under development for many years, since it promises significantly reduced NOₓ emissions. This combustion principle for air-breathing engines is based on crossflow mixing of air and gaseous hydrogen. Air and hydrogen react in multiple miniaturized diffusion-type flames with an inherent safety against flashback and with low NOₓ emissions due to a very short residence time of the reactants in the flame region. The paper presents an advanced DLN micromix hydrogen application. The experimental and numerical study shows a combustor configuration with a significantly reduced number of enlarged fuel injectors with high thermal power output at constant energy density. Larger fuel injectors reduce manufacturing costs, are more robust and are less sensitive to fuel contamination and blockage in industrial environments. The experimental and numerical results confirm the successful application of high-energy injectors, while the DLN micromix characteristics are maintained at the design point, under part-load conditions, and under off-design operation. Atmospheric test rig data on NOₓ emissions, optical flame structure, and combustor material temperatures are compared to numerical simulations and show good agreement. The impact of the applied scaling and design laws on the miniaturized micromix flamelets is investigated numerically, in particular for the resulting flow field, the flame structure, and NOₓ formation.
The Dry-Low-NOₓ (DLN) Micromix combustion technology has been developed as a low-emission combustion principle for industrial gas turbines fueled with hydrogen or syngas. The combustion process is based on the phenomenon of jet-in-crossflow mixing. Fuel is injected perpendicularly into the air crossflow and burned in a multitude of miniaturized, diffusion-like flames. The miniaturization of the flames leads to a significant reduction of NOₓ emissions due to the very short residence time of the reactants in the flame.
In the Micromix research approach, CFD analyses are validated against experimental results. The combination of numerical and experimental methods allows an efficient design and optimization of DLN Micromix combustors with respect to combustion stability and low NOₓ emissions.
The paper presents a comparison of several numerical combustion models for hydrogen and hydrogen-rich syngas. They differ in the complexity of the underlying reaction mechanism and the associated computational effort.
For pure hydrogen combustion, a one-step global reaction is applied using a hybrid Eddy-Break-up model that incorporates finite-rate kinetics. The model is evaluated and compared to a detailed hydrogen combustion mechanism derived by Li et al., comprising 9 species and 19 reversible elementary reactions. Based on this mechanism, the computational effort is reduced by applying the Flamelet Generated Manifolds (FGM) method while the accuracy of the detailed reaction scheme is maintained.
For hydrogen-rich syngas combustion (H₂-CO), numerical analyses based on a skeletal H₂/CO reaction mechanism derived by Hawkes et al. and a detailed reaction mechanism provided by Ranzi et al. are performed.
The comparison between combustion models and the validation of numerical results is based on exhaust gas compositions available from experimental investigation on DLN Micromix combustors.
The conducted evaluation confirms that the applied detailed combustion mechanisms are able to predict the general physics of the DLN-Micromix combustion process accurately. The Flamelet Generated Manifolds method proved to be generally suitable to reduce the computational effort while maintaining the accuracy of detailed chemistry.
Especially for reaction mechanisms with a high number of species, accuracy and computational effort can be balanced using the FGM model.
We present an electromechanically coupled computational model for the investigation of a thin cardiac tissue construct consisting of human-induced pluripotent stem cell-derived atrial, ventricular and sinoatrial cardiomyocytes. The mechanical and electrophysiological parts of the finite element model, as well as their coupling, are explained in detail. The model is implemented in the open-source finite element code Code_Aster and is employed for the simulation of a thin circular membrane deflected by a monolayer of autonomously beating, circular, thin cardiac tissue. Two cardio-active drugs, S-Bay K8644 and veratridine, are applied in experiments and simulations and are investigated with respect to their chronotropic effects on the tissue. These results demonstrate the potential of coupled micro- and macroscopic electromechanical models of cardiac tissue to be adapted to experimental results at the cellular level. Further model improvements are discussed, taking into account quantities that can easily be extracted from the obtained experimental results. The goal is to assess the potential of adapting the presented model to sample-specific cell cultures.
In 2015, more than three million petrol cars but only 12,363 electric cars were newly registered in Germany. The German federal government's original target of one million electric cars on German roads by 2020 (and six million by 2030) is thus receding ever further into the distance. To reach the target nonetheless, the federal government is now planning a state premium for the purchase of electric cars: the ministries for the environment, transport and economic affairs have jointly drafted a concept under which private buyers are to receive a subsidy of 5,000 euros when purchasing an electric car. 40 percent of this subsidy is to be borne by the car manufacturers. The programme, which provides for further expenditure-effective public measures, would cause costs in the billions. The intended subsidization raises the question of whether it is economically sensible.
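The cost split described above can be checked with simple arithmetic; the figures below are an order-of-magnitude illustration only, since the programme includes further measures not counted here:

```python
# Rough order-of-magnitude check of the proposed purchase premium.
subsidy_per_car = 5_000       # EUR per private electric-car purchase
manufacturer_share = 0.40     # fraction borne by the car manufacturers

# State share per car: 60% of the premium.
state_share = subsidy_per_car * (1 - manufacturer_share)   # EUR per car

# Scaled to the 2020 target of one million electric cars.
target_cars = 1_000_000
state_cost = state_share * target_cars                     # EUR total
```

Even counting only the direct premium, the state share alone reaches three billion euros at the one-million-car target, consistent with the "costs in the billions" noted above.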
To give the exchange of goods and services between the European Union (EU) and the United States (U.S.) new momentum, the two parties are currently negotiating the Transatlantic Trade and Investment Partnership (TTIP), a transatlantic free trade agreement. The aim is to create the largest free trade area in the world. The agreement, once entered into force, will oblige the EU countries and the U.S. to further liberalize their markets.
The negotiations on TTIP include a chapter on Electronic Communications/Telecommunications. The challenge therein will be securing commitments for market access to Electronic Communications services. At the same time, these commitments must reflect the legitimate need for consumer protection. The need to reduce Electronic Communications-related non-tariff barriers to trade between the Parties stems from the fact that these markets are heavily regulated. Without transnational rules on regulation, national governments can abuse these regulations to deter market entry by new (foreign) suppliers. The free trade agreement TTIP thus affects, in many respects, regulatory provisions on and access to Electronic Communications markets. The objective of this paper is therefore to examine to what extent the regulatory principles for Electronic Communications markets envisaged under TTIP will result in trade facilitation and regulatory convergence between the EU and the U.S.
The analysis concludes that the chapter on Electronic Communications will be an important step towards facilitating trade in Electronic Communications services. At the same time, some regulatory convergence will take place, but this convergence will not lead to a (full) harmonization of regulations. Rather, even after the TTIP negotiations have been concluded successfully, the norm will be mutual recognition of different regulatory regimes. Different regulations, being the optimal policy response in different market settings, will continue to exist. Moreover, it is very unlikely that such regulatory principles for the Electronic Communications sector will serve as a vehicle for a race to the bottom in levels of consumer protection.