Characterization and evaluation of lignocellulosic biomass hydrolysates for ABE fermentation
(2016)
Combined with the use of renewable energy sources for its production, hydrogen represents a possible alternative gas turbine fuel for future low-emission power generation. Due to the large differences in physical properties between hydrogen and other fuels such as natural gas, well-established gas turbine combustion systems cannot be directly applied to Dry Low NOx (DLN) hydrogen combustion. Thus, the development of DLN combustion technologies is an essential and challenging task for the future of hydrogen-fuelled gas turbines. The DLN Micromix combustion principle for hydrogen fuel has been developed to significantly reduce NOx emissions. It is based on cross-flow mixing of air and gaseous hydrogen, which react in multiple miniaturized diffusion-type flames. The major advantages of this combustion principle are its inherent safety against flashback and its low NOx emissions, owing to the very short residence time of the reactants in the flame region of the micro-flames. The Micromix combustion technology has already been proven experimentally and numerically for pure hydrogen operation at different energy density levels. The aim of the present study is to analyze the influence of geometry parameter variations on the flame structure and NOx emissions and to identify the most relevant design parameters. The goal is to provide a physical understanding of the Micromix flame's sensitivity to the burner design and to identify further optimization potential of this combustion technology while increasing its energy density and maturing it for real gas turbine application. The study reveals great optimization potential of the Micromix combustion technology with respect to its DLN characteristics and gives insight into the impact of geometry modifications on flame structure and NOx emissions.
This allows the energy density of the Micromix burners to be increased further and the technology to be integrated into industrial gas turbines.
This paper describes the development of a capacitively coupled high-pressure lamp with an input power between 20 and 43 W at 2.45 GHz, using a coaxial line network. Compared with other electrodeless lamp systems, no cavity has to be used and a reduction in the input power is achieved. This lamp is therefore an alternative to the halogen incandescent lamp for domestic lighting. To serve the demands of domestic lighting, the filling of the lamp is optimized against all other resulting requirements, such as high efficacy at low induced powers and fast startups. A workflow for developing RF-driven plasma applications is presented, which makes use of the hot S-parameter technique. Descriptions of the fitting process inside a circuit and FEM simulator are given. Results for the combined ignition and operation network from simulations and measurements are compared. An initial prototype is built, and measurements of the lamp's lighting properties are presented along with an investigation of the efficacy optimizations using large-signal amplitude modulation. With this lamp, an efficacy of 135 lm W⁻¹ is achieved.
To better understand what kinds of sports and exercise could be beneficial for the intervertebral disc (IVD), we performed a review to synthesise the literature on IVD adaptation with loading and exercise. The state of the literature did not permit a systematic review; therefore, we performed a narrative review. The majority of the available data come from cell or whole-disc loading models and animal exercise models. However, some studies have examined the impact of specific sports on IVD degeneration in humans and acute exercise on disc size. Based on the data available in the literature, loading types that are likely beneficial to the IVD are dynamic, axial, at slow to moderate movement speeds, and of a magnitude experienced in walking and jogging. Static loading, torsional loading, flexion with compression, rapid loading, high-impact loading and explosive tasks are likely detrimental for the IVD. Reduced physical activity and disuse appear to be detrimental for the IVD. We also consider the impact of genetics and the likelihood of a ‘critical period’ for the effect of exercise in IVD development. The current review summarises the literature to increase awareness amongst exercise, rehabilitation and ergonomic professionals regarding IVD health and provides recommendations on future directions in research.
Given the strong increase in regulatory requirements for business processes, the management of business process compliance is receiving growing attention in IS research. Several methods have been developed to support compliance checking of conceptual models. However, their focus on distinct modeling languages and mostly linear (i.e., predecessor-successor related) compliance rules may hinder widespread adoption and application in practice. Furthermore, hardly any of them has been evaluated in a real-world setting. We address this issue by applying a generic pattern matching approach for conceptual models to business process compliance checking in the financial sector. It consists of a model query language, a search algorithm and a corresponding modelling tool prototype. It is applicable (1) to all graph-based conceptual modeling languages and (2) to different kinds of compliance rules. Furthermore, based on an applicability check, we (3) evaluate the approach in a financial industry project setting against its relevance for decision support in audit and compliance management tasks.
The main objective of the BATIMASS project was to address how the energy balance of relatively lightweight steel buildings can be improved by incorporating ‘active thermal mass’ (ATM) into the building fabric. This was achieved through concept design, dynamic thermal modelling and testing of a number of potentially viable systems and concepts. A significant programme of thermal simulation modelling was undertaken utilising the thermally equivalent slab (TES) concept to model the passive thermal capacity effect of profiled, composite metal floor decks. It is apparent from the modelling results that thermal mass is a highly complex phenomenon that depends strongly upon building type, occupancy patterns, climate and many other aspects of the building design and servicing strategy. The ATM systems developed, both conceptually and for prototype testing, focussed on water-cooled composite slabs, the Cofradal floor system and the phase change material (PCM) Energain. In addition to laboratory testing of prototypes, whole-building monitoring was undertaken at the Kubik building in Spain and the RWTH test building in Germany. Advanced thermal modelling was also undertaken to estimate the likely benefits of the ATM concept designs developed and for comparison with the test results. In addition to thermal testing, structural tests were conducted on composite floor specimens incorporating embedded water pipes. This Final Report presents the results of the activities carried out under RFCS contract RFSR CT 2012 00033. The work is reported in six major sections corresponding to the technical Work Packages of the project. Only summaries of the work carried out are provided in this report; all work undertaken is fully reported in the formal project deliverables.
Wind-induced operational variability is one of the major challenges for structural health monitoring of slender engineering structures such as aircraft wings or wind turbine blades. Damage-sensitive features often show an even greater sensitivity to operational variability. In this study, a composite cantilever was subjected to multiple mass configurations, velocities and angles of attack in a controlled wind tunnel environment. A small-scale impact damage was introduced to the specimen, and the structural response measurements were repeated. The proposed damage detection methodology is based on automated operational modal analysis. A novel baseline preparation procedure is described that reduces the required user interaction to the provision of a single consistency threshold. The procedure starts with an indeterminate number of operational modal analysis identifications from a large number of datasets and returns a complete baseline matrix of natural frequencies and damping ratios that is suitable for subsequent anomaly detection. Mahalanobis distance-based anomaly detection is then applied to successfully detect the damage under varying severities of operational variability and with various degrees of knowledge about the present operational conditions. The damage detection capabilities of the proposed methodology were found to be excellent under varying velocities and angles of attack. Damage detection was less successful under joint mass and wind variability but could be significantly improved by providing the currently encountered operational conditions.
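The baseline-plus-Mahalanobis anomaly detection described above can be sketched in a few lines. The (frequency, damping) values, sample sizes and the `mahalanobis_scores` helper below are hypothetical stand-ins; only the scoring principle follows the abstract.

```python
import numpy as np

def mahalanobis_scores(baseline, observations):
    """Anomaly score of each observation relative to a baseline of
    (natural frequency, damping ratio) feature vectors: the squared
    Mahalanobis distance to the baseline mean."""
    mu = baseline.mean(axis=0)
    cov = np.cov(baseline, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    diff = observations - mu
    # quadratic form diff_i . cov_inv . diff_i for each row i
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

rng = np.random.default_rng(0)
# Synthetic baseline: mode at ~10 Hz, ~2% damping, with scatter
baseline = rng.normal([10.0, 0.02], [0.1, 0.002], size=(200, 2))
healthy = np.array([[10.05, 0.021]])
damaged = np.array([[9.5, 0.03]])   # frequency drop typical of damage

assert mahalanobis_scores(baseline, healthy)[0] < mahalanobis_scores(baseline, damaged)[0]
```

In practice, an observation is flagged when its score exceeds a threshold calibrated on baseline data.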
The Monte Carlo code FLUKA is used to simulate the production of a number of positron emitting radionuclides, ¹⁸F, ¹³N, ⁹⁴Tc, ⁴⁴Sc, ⁶⁸Ga, ⁸⁶Y, ⁸⁹Zr, ⁵²Mn, ⁶¹Cu and ⁵⁵Co, on a small medical cyclotron with a proton beam energy of 13 MeV. Experimental data collected at the TR13 cyclotron at TRIUMF agree within a factor of 0.6 ± 0.4 with the directly simulated data, except for the production of ⁵⁵Co, where the simulation underestimates the experiment by a factor of 3.4 ± 0.4. The experimental data also agree within a factor of 0.8 ± 0.6 with the convolution of simulated proton fluence and cross sections from literature. Overall, this confirms the applicability of FLUKA to simulate radionuclide production at 13 MeV proton beam energy.
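The convolution of simulated proton fluence with literature cross sections mentioned above amounts to a simple numerical integral. The spectrum, cross section and target density below are hypothetical placeholders, not TR13 data; the sketch only shows the form of the calculation.

```python
import numpy as np

# Production rate ~ n_target * integral over E of phi(E) * sigma(E) dE
E = np.linspace(5.0, 13.0, 81)                        # proton energy grid, MeV
dE = E[1] - E[0]
phi = np.exp(-0.5 * ((E - 11.0) / 1.5) ** 2)          # assumed fluence spectrum (a.u.)
sigma_cm2 = 200e-27 * np.exp(-0.5 * ((E - 9.0) / 2.0) ** 2)  # assumed cross section, cm^2

n_target = 5.0e22                                     # assumed target atoms per cm^2
rate = n_target * np.sum(phi * sigma_cm2) * dE        # relative production rate
assert rate > 0.0
```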
For the successful implementation of microfluidic reaction systems such as PCR and electrophoresis, the movement of small liquid volumes is essential. In conventional lab-on-a-chip platforms, solvents and samples are passed through defined microfluidic channels with complex flow control installations. The droplet actuation platform presented here is a promising alternative: it allows a liquid drop (microreactor) to be moved on the planar surface of a reaction platform (lab-in-a-drop). The actuation of microreactors on the hydrophobic surface of the platform is based on magnetic forces acting on the outer shell of the liquid drops, which is made of a thin layer of superhydrophobic magnetite particles. The hydrophobic surface of the platform is needed to avoid any contact between the liquid core and the surface and thus allow a smooth movement of the microreactor. On the platform, one or more microreactors with volumes of 10 µL can be positioned and moved simultaneously. The platform itself consists of a 3 × 3 matrix of electrical double coils which accommodate either neodymium or iron cores. The magnetic field gradients are automatically controlled. By varying the magnetic field gradients, the microreactors' magnetic hydrophobic shell can be manipulated automatically to move the microreactor or open the shell reversibly. Reactions of substrates and corresponding enzymes can be initiated by merging the microreactors or bringing them into contact with surface-immobilized catalysts.
Application of the optical flow method to velocity determination in hydraulic structure models
(2016)
Analysis of the long-term effect of the MBST® nuclear magnetic resonance therapy on gonarthrosis
(2016)
Autoradiography is a well-established method of nuclear imaging. When different radionuclides are present simultaneously, additional processing is needed to distinguish distributions of radionuclides. In this work, a method is presented where aluminium absorbers of different thickness are used to produce images with different cut-off energies. By subtracting images pixel-by-pixel one can generate images representing certain ranges of β-particle energies. The method is applied to the measurement of irradiated reactor graphite samples containing several radionuclides to determine the spatial distribution of these radionuclides within pre-defined energy windows. The process was repeated under fixed parameters after thermal treatment of the samples. The greyscale images of the distribution after treatment were subtracted from the corresponding pre-treatment images. Significant changes in the intensity and distribution of radionuclides could be observed in some samples. Due to the thermal treatment parameters the most significant differences were observed in the ³H and ¹⁴C inventory and distribution.
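The absorber-based energy-window technique reduces to pixel-by-pixel image arithmetic. Below is a minimal sketch with synthetic count images; the intensities are invented, and real autoradiographs would also carry noise and scattering between exposures that this toy ignores.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)
# Hypothetical stand-ins: counts from a low-energy (3H-like) emitter that a thin
# absorber stops, and a higher-energy (14C-like) emitter that penetrates it.
low_only = rng.poisson(5.0, shape)
high = rng.poisson(2.0, shape)

img_no_absorber = low_only + high     # all beta particles recorded
img_with_absorber = high              # absorber cuts off low energies

# Pixel-by-pixel subtraction isolates the low-energy window
low_window = img_no_absorber.astype(int) - img_with_absorber.astype(int)
assert np.array_equal(low_window, low_only)
```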
Replacement tissues, designed to fill in articular cartilage defects, should exhibit the same properties as the native material. The aim of this study is to foster the understanding of, firstly, the mechanical behavior of the material itself and, secondly, the influence of cultivation parameters on cell seeded implants as well as on cell migration into acellular implants. In this study, acellular cartilage replacement material is theoretically, numerically and experimentally investigated regarding its viscoelastic properties, where a phenomenological model for practical applications is developed. Furthermore, remodeling and cell migration are investigated.
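As a concrete instance of the kind of phenomenological viscoelastic description mentioned above, a standard linear solid under a step strain relaxes exponentially. The parameter values are illustrative, and the paper's actual model may differ.

```python
import numpy as np

def sls_relaxation(t, eps0, E_inf, E_1, tau):
    """Stress relaxation of a standard linear solid under a step strain eps0:
    sigma(t) = eps0 * (E_inf + E_1 * exp(-t / tau))."""
    return eps0 * (E_inf + E_1 * np.exp(-t / tau))

t = np.linspace(0.0, 100.0, 201)  # seconds
# Illustrative moduli (MPa-scale) and time constant
sigma = sls_relaxation(t, eps0=0.1, E_inf=0.5, E_1=1.5, tau=10.0)
# Stress decays monotonically from eps0*(E_inf+E_1) toward eps0*E_inf
```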
According to the current state of science and technology, components are optimized with respect to their individual properties, such as service life or energy efficiency. However, even excellent components can lead to inefficient or unstable systems if their interaction is insufficiently taken into account. A system-level view creates greater optimization potential; this increased potential, however, comes with an increased degree of complexity. The present work was carried out within Collaborative Research Center (SFB) 805, whose goal is the control of uncertainty in mechanical engineering systems. Using a real system from the field of hydraulics, the work shows how uncertainty can be controlled during the development phase. What is new here is that the system degradation to be expected from later operation can be anticipated for every possible system proposal. This allows operating and maintenance costs to be predicted and minimized, and the availability of the system to be guaranteed through an optimal operating and maintenance strategy. Essential questions in the optimal design of the hydrostatic transmission under consideration are its physical modeling, the representation of the optimization problem as a mixed-integer linear program, and the algorithmic treatment of that program to find solutions. To this end, heuristics for finding sensible system topologies more quickly are presented, and the dynamic wear and maintenance behavior of possible system proposals is evaluated by means of mathematical decomposition. The work presents the optimization of technical systems at the interface of mathematics, computer science and engineering in a manner that is thorough as well as illustrative and easy to follow.
The interplay of albumin (BSA) and lysozyme (LYZ) adsorbed simultaneously on titanium was analyzed by gel electrophoresis and BCA assay. It was found that BSA and lysozyme adsorb cooperatively. Additionally, the isoelectric point of the respective protein influences the adsorption. Also, the enzymatic activity of lysozyme and amylase (AMY) in mixtures with BSA was considered with respect to a possible influence of protein-protein interaction on enzyme activity. Indeed, an increase of lysozyme activity in the presence of BSA could be observed. In contrast, BSA does not influence the activity of amylase.
We present a new Min-Max theorem for an optimization problem closely connected to matchings and vertex covers in balanced hypergraphs. The result generalizes Kőnig’s Theorem (Berge and Las Vergnas in Ann N Y Acad Sci 175:32–40, 1970; Fulkerson et al. in Math Progr Study 1:120–132, 1974) and Hall’s Theorem (Conforti et al. in Combinatorica 16:325–329, 1996) for balanced hypergraphs.
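For orientation, König's classical theorem, the graph case that the new Min-Max result generalizes, can be verified on a small bipartite example: the maximum matching size equals the minimum vertex cover size. The graph below is an arbitrary toy instance.

```python
from itertools import combinations

# Bipartite graph: left vertices 0..2, right vertices 'a'..'c'
edges = [(0, 'a'), (0, 'b'), (1, 'b'), (2, 'b'), (2, 'c')]
left = [0, 1, 2]

def max_matching(left, edges):
    """Maximum bipartite matching via augmenting paths (Kuhn's algorithm)."""
    adj = {u: [v for (x, v) in edges if x == u] for u in left}
    match = {}  # right vertex -> matched left vertex

    def try_augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if v not in match or try_augment(match[v], seen):
                match[v] = u
                return True
        return False

    return sum(try_augment(u, set()) for u in left)

def min_vertex_cover_size(edges):
    """Brute-force: smallest vertex set touching every edge."""
    vertices = {x for e in edges for x in e}
    for k in range(len(vertices) + 1):
        for cover in combinations(vertices, k):
            if all(u in cover or v in cover for (u, v) in edges):
                return k

# Koenig: max matching = min vertex cover in bipartite graphs
assert max_matching(left, edges) == min_vertex_cover_size(edges) == 3
```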
There are different types of games that try to harness the motivation of a gaming situation in learning contexts. This paper introduces the new term ‘Competence Developing Game’ (CDG) as an umbrella term for all games with this intention. Based on this new terminology, an assessment framework has been developed and validated in the scope of an empirical study. Now, all different types of CDGs can be evaluated according to a defined and uniform set of assessment criteria and are thus comparable with respect to their characteristics and effectiveness.
A New Class of Biosensors Based on Tobacco Mosaic Virus and Coat Proteins as Enzyme Nanocarrier
(2016)
Purpose
Two semi-empirical models were recently published, both making use of existing literature data, but each taking into account different physical phenomena that trigger hemolysis. In the first model, hemoglobin (Hb) release is described as a permeation procedure across the membrane, assuming a shear stress-dependent process (sublethal model). The second model only accounts for hemoglobin release that is caused by cell membrane breakdown, which occurs when red blood cells (RBC) undergo mechanically induced shearing for a period longer than the threshold time (nonuniform threshold model). In this paper, we introduce a model that considers the hemolysis generated by both these possible phenomena.
Methods
Since hemolysis can possibly be caused by permeation of hemoglobin through the RBC functional membrane as well as by release of hemoglobin from RBC membrane breakdown, our proposed model combines both these models. An experimental setup consisting of a Couette device was utilized for validation of our proposed model.
Results
A comparison is presented between the damage index (DI) predicted by the proposed model, the sublethal model, the nonuniform threshold model and experimental datasets. This comparison covers a wide range of shear stresses for both human and porcine blood. Good agreement between the measured DI and the DI predicted by the present model was obtained.
Conclusions
The semiempirical hemolysis model introduced in this paper aims for significantly enhanced conformity with experimental data. Two phenomenological outcomes become possible with the proposed approach: an estimation of the average time after which cell membrane breakdown occurs under the applied conditions, and a prediction of the ratio between the phenomena involved in hemolysis.
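For context, hemolysis damage indices are commonly expressed as power laws in shear stress and exposure time; the classic Giersiepen correlation is sketched below as a reference point. This is not the combined model proposed in the paper, which instead partitions damage between membrane permeation and membrane breakdown.

```python
def damage_index_power_law(tau_pa, t_s):
    """Classic power-law hemolysis correlation (Giersiepen et al., 1990):
    DI [%] = 3.62e-5 * tau^2.416 * t^0.785,
    with shear stress tau in Pa and exposure time t in seconds.
    Shown for orientation only; it is not the combined model of the paper."""
    return 3.62e-5 * tau_pa ** 2.416 * t_s ** 0.785

# Example: 100 Pa applied for 0.1 s gives a DI well below 1 %
di = damage_index_power_law(100.0, 0.1)
assert 0.0 < di < 1.0
```

The strong exponent on shear stress is why such correlations are sensitive to the exact stress history a cell experiences.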
Purpose
To calculate local specific absorption rate (SAR) correctly, both the amplitude and phase of the signal in each transmit channel have to be known. In this work, we propose a method to derive a conservative upper bound for the local SAR, with a reasonable safety margin without knowledge of the transmit phases of the channels.
Methods
The proposed method uses virtual observation points (VOPs). Correction factors are calculated for each set of VOPs that prevent underestimation of local SAR when the VOPs are applied with the correct amplitudes but fixed phases.
Results
The proposed method proved to be superior to the worst-case calculation based on the maximum eigenvalue of the VOPs. The mean overestimation for six coil setups could be reduced, whereas no underestimation of the maximum local SAR occurred. In the best investigated case, the overestimation could be reduced from a factor of 3.3 to a factor of 1.7.
Conclusion
The upper bound for the local SAR calculated with the proposed method allows a fast estimation of the local SAR based on power measurements in the transmit channels and facilitates SAR monitoring in systems that do not have the capability to monitor transmit phases.
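The idea of bounding local SAR without phase knowledge can be illustrated with a simple triangle-inequality bound over the VOP matrices. This is a generic amplitude-only bound for illustration, not the paper's correction-factor method, which is tighter; the VOP matrices and drive vector below are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(2)

def random_vop(n):
    """Random positive semi-definite stand-in for a VOP matrix."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return a.conj().T @ a

def local_sar(x, vops):
    """Exact local SAR for complex drive vector x: max over VOPs of x^H Q x."""
    return max(np.real(x.conj() @ q @ x) for q in vops)

def amplitude_only_bound(amps, vops):
    """Phase-agnostic bound: sum of |x_i| |Q_ij| |x_j| dominates x^H Q x
    for every choice of transmit phases (triangle inequality)."""
    return max(amps @ np.abs(q) @ amps for q in vops)

n = 8
vops = [random_vop(n) for _ in range(5)]
amps = rng.uniform(0.5, 1.0, n)
phases = rng.uniform(0.0, 2 * np.pi, n)
x = amps * np.exp(1j * phases)

assert local_sar(x, vops) <= amplitude_only_bound(amps, vops)
```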
An equitable graph coloring is a proper vertex coloring of a graph G in which the sizes of the color classes differ by at most one. The equitable chromatic number is the smallest number k such that G admits such an equitable k-coloring. We focus on enumerative algorithms for computing the equitable chromatic number and propose a general scheme to derive pruning rules for them: we show how the extendability of a partial coloring into an equitable coloring can be modeled via network flows. Thus, we obtain pruning rules which can be checked via flow algorithms. Computational experiments show that these rules can significantly reduce the size of the search tree of enumerative algorithms and, on most instances, even this naive approach yields a faster algorithm. Moreover, the stability, i.e., the number of instances solved within a given time limit, is greatly improved.
Since executing flow algorithms at each node of a search tree is time-consuming, we derive arithmetic pruning rules (generalized Hall conditions) from the network model. Adding these rules to an enumerative algorithm yields an even larger runtime improvement.
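A minimal sketch of the kind of Hall-type arithmetic check such pruning rules are built from, here for assigning uncolored vertices to capacitated color classes. The instance and the `hall_feasible` helper are hypothetical; the paper derives its conditions from a network-flow model.

```python
from itertools import combinations

def hall_feasible(allowed, capacity):
    """Hall-type feasibility test for assigning items to capacitated classes:
    for every subset S of items, |S| must not exceed the total capacity of the
    classes allowed for at least one item in S. (Brute force over subsets; a
    flow algorithm checks the same condition efficiently.)"""
    items = list(allowed)
    for k in range(1, len(items) + 1):
        for subset in combinations(items, k):
            classes = set().union(*(allowed[i] for i in subset))
            if len(subset) > sum(capacity[c] for c in classes):
                return False
    return True

# Hypothetical pruning situation: residual class capacities after a partial coloring
capacity = {'red': 1, 'blue': 1}
allowed = {'v1': {'red'}, 'v2': {'red'}, 'v3': {'blue'}}
# v1 and v2 compete for the single remaining 'red' slot -> branch can be pruned
assert not hall_feasible(allowed, capacity)
```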
Manufacturing process simulation enables the evaluation and improvement of autoclave mold concepts early in the design phase. To achieve a high part quality at low cycle times, the thermal behavior of the autoclave mold can be investigated by means of simulations. Most challenging for such a simulation is the generation of necessary boundary conditions. Heat-up and temperature distribution in an autoclave mold are governed by flow phenomena, tooling material and shape, position within the autoclave, and the chosen autoclave cycle. This paper identifies and summarizes the most important factors influencing mold heat-up and how they can be introduced into a thermal simulation. Thermal measurements are used to quantify the impact of the various parameters. Finally, the gained knowledge is applied to develop a semi-empirical approach for boundary condition estimation that enables a simple and fast thermal simulation of the autoclave curing process with reasonably high accuracy for tooling optimization.
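A first-order lumped-capacitance model is the simplest possible stand-in for such a semi-empirical heat-up estimate. All parameter values below are assumed for illustration only.

```python
import numpy as np

def mold_heatup(t, T0, T_air, h, A, m, cp):
    """Lumped-capacitance estimate of autoclave mold heat-up:
    dT/dt = h*A*(T_air - T) / (m*cp)
      =>  T(t) = T_air - (T_air - T0) * exp(-t / tau),  tau = m*cp / (h*A)."""
    tau = m * cp / (h * A)
    return T_air - (T_air - T0) * np.exp(-t / tau)

t = np.linspace(0.0, 3600.0, 61)  # one hour, seconds
# Assumed values: 150 kg steel tool (cp ~ 500 J/kg/K), 2 m^2 wetted area,
# convective coefficient 60 W/m^2/K, autoclave air at 180 degC
T = mold_heatup(t, T0=20.0, T_air=180.0, h=60.0, A=2.0, m=150.0, cp=500.0)
# tau = 150*500/(60*2) = 625 s; after one hour the tool is close to air temperature
```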
Finding a good system topology with more than a handful of components is a highly non-trivial task. The system needs to be able to fulfil all expected load cases, but at the same time the components should interact in an energy-efficient way. An example of a system design problem is the layout of the drinking water supply of a residential building. It may be reasonable to choose a design of spatially distributed pumps which are connected by pipes in at least two dimensions. This leads to a large variety of possible system topologies. To solve such problems in a reasonable time frame, the nonlinear technical characteristics must be modelled as simply as possible, while still achieving a sufficiently good representation of reality. The aim of this paper is to compare the speed and reliability of a selection of leading mathematical programming solvers on a set of varying model formulations. This gives us empirical evidence on which combinations of model formulations and solver packages are the means of choice with the current state of the art.
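To make the combinatorial explosion concrete, the toy sketch below enumerates pump subsets for a two-load-case design problem by brute force. The pump data are invented, and realistic formulations are solved as mixed-integer programs rather than by enumeration, which is exactly why solver performance matters.

```python
from itertools import combinations

# Hypothetical candidate pumps: (name, max flow in m^3/h, power at duty in kW)
pumps = [('P1', 6.0, 2.0), ('P2', 4.0, 1.2), ('P3', 3.0, 0.8)]
load_cases = [5.0, 8.0]  # required flows every design must be able to cover

def best_topology(pumps, load_cases):
    """Exhaustive search over pump subsets: keep designs whose combined
    capacity covers every load case, return the one with least power."""
    best = None
    for k in range(1, len(pumps) + 1):
        for design in combinations(pumps, k):
            cap = sum(p[1] for p in design)
            if all(cap >= q for q in load_cases):
                power = sum(p[2] for p in design)
                if best is None or power < best[1]:
                    best = (design, power)
    return best

design, power = best_topology(pumps, load_cases)
# P1 + P3 covers 9 m^3/h at 2.8 kW, cheaper than P1 + P2 at 3.2 kW
```

With n candidate components there are 2^n subsets before pipe routing is even considered, so the enumeration above is only viable for tiny instances.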
Mice that have been genetically humanized for proteins involved in drug metabolism and toxicity and mice engrafted with human hepatocytes are emerging and promising in vivo models for an improved prediction of the pharmacokinetic, drug–drug interaction and safety characteristics of compounds in humans. The specific advantages and disadvantages of these models should be carefully considered when using them for studies in drug discovery and development. Here, an overview of the corresponding genetically humanized and chimeric liver humanized mouse models described to date is provided and illustrated with examples of their utility in drug metabolism and toxicity studies. We compare the strengths and weaknesses of the two different approaches, give guidance for the selection of the appropriate model for various applications, and discuss future trends and perspectives.
This summer, RoboCup competitions were held for the 20th time, in Leipzig, Germany. It was the second time that RoboCup took place in Germany, 10 years after the 2006 RoboCup in Bremen. In this article, we give an overview of the latest developments in RoboCup and of what has happened in the different leagues over the last decade. With its 20th edition, RoboCup clearly is a success story and a role model for robotics competitions. From our personal viewpoint, we acknowledge this with a retrospective on what makes RoboCup such a success.
20 Years of RoboCup
(2016)