We present an effective and simple multiscale method for equilibrating Kremer-Grest model polymer melts of varying stiffness. In our approach, we progressively equilibrate the melt structure above the tube scale, inside the tube and finally at the monomeric scale. We make use of models designed to be computationally effective at each scale. Density fluctuations in the melt structure above the tube scale are minimized through a Monte Carlo simulated annealing of a lattice polymer model. Subsequently, the melt structure below the tube scale is equilibrated via the Rouse dynamics of a force-capped Kremer-Grest model that allows chains to partially interpenetrate. Finally, the Kremer-Grest force field is introduced to freeze the topological state and enforce correct monomer packing. We generate 15 melts of 500 chains of 10,000 beads for varying chain stiffness, as well as a number of melts with 1,000 chains of 15,000 monomers. To validate the equilibration process, we study the time evolution of bulk, collective, and single-chain observables at the monomeric, mesoscopic, and macroscopic length scales. Extension of the present method to longer, branched, or polydisperse chains, and/or larger system sizes is straightforward.
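The above-tube-scale step, a Monte Carlo simulated annealing that minimizes density fluctuations of a coarse-grained lattice model, can be sketched as follows. This is a minimal illustration with invented parameters (lattice size, chain count, cooling schedule), not the model from the paper; each chain is reduced to a single blob on a one-dimensional periodic lattice.

```python
import math
import random

random.seed(1)

# Hypothetical coarse-grained melt: each chain is one "blob" on a periodic
# 1D lattice (all parameters are illustrative, not from the paper).
CELLS, CHAINS = 20, 200
pos = [random.randrange(CELLS) for _ in range(CHAINS)]

def counts(positions):
    c = [0] * CELLS
    for p in positions:
        c[p] += 1
    return c

def energy(positions):
    """Sum of squared deviations from the mean cell occupancy."""
    mean = CHAINS / CELLS
    return sum((n - mean) ** 2 for n in counts(positions))

T = 5.0
e = energy(pos)
while T > 1e-3:
    for _ in range(500):
        i = random.randrange(CHAINS)
        old = pos[i]
        pos[i] = (old + random.choice((-1, 1))) % CELLS  # local hop
        de = energy(pos) - e
        if de <= 0 or random.random() < math.exp(-de / T):
            e += de          # accept the move (Metropolis criterion)
        else:
            pos[i] = old     # reject and restore
    T *= 0.9                 # geometric cooling schedule

print("final density-fluctuation energy:", e)
```

Because moves with zero energy change are always accepted, the system can diffuse along energy-neutral configurations, so the annealing typically ends near a uniform density.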
Engineers are of particular importance for the societies of tomorrow. The big social challenges society will face in the future can only be mastered if engineers link the development and innovation process closely to the requirements of people. As a result, within the innovation process engineers have to design and develop products for diverse users. The consideration of diversity in this process is therefore a core competence engineers should have. Implementing diverse requirements in product design is also linked to the development of sustainable products and thus leads to socially responsible research and development, the core concept formulated by the EU.
For this reason, future engineers should be educated to view the technical perspectives of a problem as embedded in the related questions of the societies they are developing their artefacts for. The aim of teaching engineering should therefore be to prepare engineers for these requirements and to draw attention to the diverse needs in a globalized world.
To match the competence profiles of future engineers to the global challenges and the resulting social responsibility, RWTH Aachen University, one of the leading technical universities in Germany, has established the bridging professorship “Gender and Diversity in Engineering” (GDI), which educates engineers with an interdisciplinary approach to expand the limits of engineering. The interdisciplinary teaching concept of the research group imparts application-oriented Gender and Diversity expertise to future engineers. Within this established teaching concept, which is the result of the experience and expertise of the research group, students gain theoretical knowledge about Gender and Diversity and learn how to transfer it into their later field of action.
At the conference, the institutional approach will be presented together with the teaching concept, which will be illustrated with concrete course examples.
20 Years of RoboCup
(2016)
Smoothed Finite Element Methods for Nonlinear Solid Mechanics Problems: 2D and 3D Case Studies
(2016)
The Smoothed Finite Element Method (SFEM) is presented as edge-based and face-based techniques for 2D and 3D boundary value problems, respectively. SFEMs avoid shortcomings of the standard Finite Element Method (FEM) with lower-order elements, such as overly stiff behavior, poor stress solutions, and locking effects. Based on the idea of spatially averaging the standard FEM strain field over so-called smoothing domains, SFEM calculates the stiffness matrix for the same number of degrees of freedom (DOFs) as the FEM. However, the SFEMs significantly improve accuracy and convergence, even for distorted meshes and/or nearly incompressible materials.
Numerical results of the SFEMs for a cardiac tissue membrane (thin plate inflation) and an artery (tension of a 3D tube) clearly show their advantage in improving accuracy, particularly for distorted meshes, and in avoiding shear locking effects.
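For orientation, the strain-smoothing idea can be written in standard SFEM notation (a textbook formulation, not equations quoted from the paper): the compatible strain field is averaged over each smoothing domain $\Omega_k$ of area (or volume) $A_k$, which the divergence theorem turns into a boundary integral over $\Gamma_k$,

```latex
\tilde{\boldsymbol{\varepsilon}}_k
  = \frac{1}{A_k}\int_{\Omega_k} \boldsymbol{\varepsilon}_h(\mathbf{x})\,\mathrm{d}\Omega
  = \frac{1}{2A_k}\int_{\Gamma_k}
      \left(\mathbf{u}\otimes\mathbf{n} + \mathbf{n}\otimes\mathbf{u}\right)\mathrm{d}\Gamma,
\qquad
\tilde{\mathbf{K}} = \sum_k \tilde{\mathbf{B}}_k^{\mathsf{T}}\,\mathbf{D}\,\tilde{\mathbf{B}}_k\,A_k .
```

Only shape-function values on the smoothing-domain boundaries are needed, which is one reason the method tolerates distorted meshes.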
Rubber materials filled with reinforcing fillers display nonlinear rheological behavior already at small strain amplitudes, γ0 < 0.1. Nevertheless, rheological data are mostly analyzed in terms of linear parameters, such as the shear moduli (G′, G″), which lose their physical meaning in the nonlinear regime. In this work, styrene butadiene rubber filled with carbon black (CB) under large amplitude oscillatory shear (LAOS) is analyzed in terms of the nonlinear parameter I3/1. Three different CB grades are used and the filler load is varied between 0 and 70 phr. It is found that I3/1(φ) is most sensitive to changes of the total accessible filler surface area at low strain amplitudes (γ0 = 0.32). The addition of up to 70 phr CB leads to an increase of I3/1(φ) by a factor of more than ten. The influence of the measurement temperature on I3/1 is pronounced for CB levels above the percolation threshold.
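The nonlinear measure I3/1 is the ratio of the third to the first harmonic in the Fourier spectrum of the periodic stress response. A minimal sketch of extracting it, using a synthetic stress signal rather than measured data:

```python
import numpy as np

f1 = 1.0                        # excitation frequency [Hz] (illustrative)
fs = 256                        # sampling rate [Hz]
t = np.arange(0, 8.0, 1 / fs)   # 8 full cycles, so harmonics sit on FFT bins

# Odd higher harmonics signal a nonlinear (non-sinusoidal) stress response;
# here a 5% third harmonic is added by hand for illustration.
stress = 1.0 * np.sin(2 * np.pi * f1 * t) + 0.05 * np.sin(2 * np.pi * 3 * f1 * t)

spec = np.abs(np.fft.rfft(stress))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
i1 = spec[np.argmin(np.abs(freqs - f1))]        # first-harmonic intensity
i3 = spec[np.argmin(np.abs(freqs - 3 * f1))]    # third-harmonic intensity
print(round(i3 / i1, 3))  # → 0.05
```

Sampling an integer number of cycles avoids spectral leakage, so the recovered ratio matches the injected 5% third harmonic.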
This paper describes the development of a capacitively coupled high-pressure lamp with input power between 20 and 43 W at 2.45 GHz, using a coaxial line network. Compared with other electrodeless lamp systems, no cavity has to be used and a reduction in the input power is achieved. Therefore, this lamp is an alternative to the halogen incandescent lamp for domestic lighting. To serve the demands of domestic lighting, the filling of the lamp is optimized with respect to all other resulting requirements, such as high efficacy at low induced powers and fast startup. A workflow to develop RF-driven plasma applications is presented, which makes use of the hot S-parameter technique. Descriptions of the fitting process inside a circuit and FEM simulator are given. Results of the combined ignition and operation network from simulations and measurements are compared. An initial prototype is built and measurements of the lamp's lighting properties are presented, along with an investigation of efficacy optimization using large-signal amplitude modulation. With this lamp, an efficacy of 135 lm·W⁻¹ is achieved.
Compared to peripheral pain, trigeminal pain elicits higher levels of fear, which is assumed to enhance the interruptive effects of pain on concomitant cognitive processes. In this fMRI study we examined the behavioral and neural effects of trigeminal (forehead) and peripheral (hand) pain on visual processing and memory encoding. Cerebral activity was measured in 23 healthy subjects performing a visual categorization task that was immediately followed by a surprise recognition task. During the categorization task subjects received concomitant noxious electrical stimulation on the forehead or hand. Our data show that fear ratings were significantly higher for trigeminal pain. Categorization and recognition performance did not differ between pictures that were presented with trigeminal and peripheral pain. However, object categorization in the presence of trigeminal pain was associated with stronger activity in task-relevant visual areas (lateral occipital complex, LOC), memory encoding areas (hippocampus and parahippocampus) and areas implicated in emotional processing (amygdala) compared to peripheral pain. Further, individual differences in neural activation between the trigeminal and the peripheral condition were positively related to differences in fear ratings between both conditions. Functional connectivity between amygdala and LOC was increased during trigeminal compared to peripheral painful stimulation. Fear-driven compensatory resource activation seems to be enhanced for trigeminal stimuli, presumably due to their exceptional biological relevance.
The potential of SMART climbing robot combined with a weatherproof cabin for rotor blade maintenance
(2016)
In this paper we present an extension of the action language Golog that allows for using fuzzy notions in non-deterministic argument choices and in the reward function in decision-theoretic planning. Often, in decision-theoretic planning, it is cumbersome to specify the set of values to pick from in the non-deterministic-choice-of-argument statement. Also, even for domain experts, it is not always easy to specify a reward function. Instead of providing a finite domain for values in the non-deterministic-choice-of-argument statement in Golog, we now allow for stating the argument domain by simply providing a formula over linguistic terms and fuzzy fluents. In Golog’s forward-search DT planning algorithm, these formulas are evaluated in order to find the agent’s optimal policy. We illustrate this in the Diner Domain, where the agent needs to calculate the optimal serving order.
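The core idea, rating candidate arguments by a fuzzy formula over linguistic terms instead of enumerating a finite domain, can be sketched as follows. The terms, membership shapes, and candidate values below are invented for illustration and are not from the paper.

```python
def triangular(a, b, c):
    """Membership function rising from a to a peak at b, falling to c."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Invented linguistic terms over a "distance" fluent.
close = triangular(-1.0, 0.0, 4.0)
medium = triangular(2.0, 5.0, 8.0)

def formula(d):
    # Fuzzy formula "close OR medium", with OR as max (Zadeh semantics).
    return max(close(d), medium(d))

# The planner picks the candidate argument with the highest membership
# degree instead of iterating over an explicitly listed finite domain.
candidates = [0.5, 3.0, 5.0, 9.0]
best = max(candidates, key=formula)
print(best)  # → 5.0
```

In a forward-search planner the same evaluation would be applied at each non-deterministic choice point, with the degrees feeding into the policy's expected reward.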
We present a new Min-Max theorem for an optimization problem closely connected to matchings and vertex covers in balanced hypergraphs. The result generalizes Kőnig’s Theorem (Berge and Las Vergnas in Ann N Y Acad Sci 175:32–40, 1970; Fulkerson et al. in Math Progr Study 1:120–132, 1974) and Hall’s Theorem (Conforti et al. in Combinatorica 16:325–329, 1996) for balanced hypergraphs.
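For orientation, the Kőnig property being generalized can be stated as follows (a standard formulation, not quoted from the paper): in a balanced hypergraph $H$, the maximum size of a matching equals the minimum size of a vertex cover (transversal),

```latex
\nu(H) \;=\; \tau(H).
```

For bipartite graphs, which are exactly the balanced 2-uniform hypergraphs, this reduces to Kőnig's classical theorem.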
Mice that have been genetically humanized for proteins involved in drug metabolism and toxicity and mice engrafted with human hepatocytes are emerging and promising in vivo models for an improved prediction of the pharmacokinetic, drug–drug interaction and safety characteristics of compounds in humans. The specific advantages and disadvantages of these models should be carefully considered when using them for studies in drug discovery and development. Here, an overview on the corresponding genetically humanized and chimeric liver humanized mouse models described to date is provided and illustrated with examples of their utility in drug metabolism and toxicity studies. We compare the strength and weaknesses of the two different approaches, give guidance for the selection of the appropriate model for various applications and discuss future trends and perspectives.
The main objective of the BATIMASS project was to address how the energy balance of relatively lightweight steel buildings can be improved by building ‘active thermal mass’ (ATM) into the building fabric. This was achieved through concept design, dynamic thermal modelling and testing of a number of potentially viable systems and concepts. A significant programme of thermal simulation modelling was undertaken, utilising the thermally equivalent slab (TES) concept to model the passive thermal capacity effect of profiled, composite metal floor decks. It is apparent from the modelling results that thermal mass is a complex phenomenon which is highly dependent upon building type, occupancy patterns, climate and many other aspects of the building design and servicing strategy. The ATM systems developed, both conceptually and for prototype testing, focussed on water-cooled composite slabs, the Cofradal floor system and the phase change material (PCM) Energain. In addition to laboratory testing of prototypes, whole-building monitoring was undertaken at the Kubik building in Spain and the RWTH test building in Germany. Advanced thermal modelling was also undertaken to estimate the likely benefits of the ATM concept designs developed and for comparison with the test results. In addition to thermal testing, structural tests were conducted on composite floor specimens incorporating embedded water pipes. This Final Report presents the results of the activities carried out under RFCS contract RFSR CT 2012 00033. The work is reported in six major sections corresponding to the technical Work Packages of the project. Only summaries of the work are provided in this report; all work undertaken is fully reported in the formal project deliverables.
The interplay of albumin (BSA) and lysozyme (LYZ) adsorbed simultaneously on titanium was analyzed by gel electrophoresis and BCA assay. It was found that BSA and lysozyme adsorb cooperatively. Additionally, the isoelectric point of the respective protein influences the adsorption. Also, the enzymatic activity of lysozyme and amylase (AMY) in mixtures with BSA was considered with respect to a possible influence of protein-protein interaction on enzyme activity. Indeed, an increase of lysozyme activity in the presence of BSA could be observed. In contrast, BSA does not influence the activity of amylase.
Evaluation of lignocellulosic material for butanol production using enzymatic hydrolysate medium
(2016)
Butanol is a promising gasoline additive and platform chemical that can be readily produced via acetone-butanol-ethanol (ABE) fermentation from pretreated lignocellulosic materials. This article examines lignocellulosic material from beech wood for ABE fermentation using Clostridium acetobutylicum. First, the utilization of both C₅ (xylose) and C₆ (glucose) sugars as sole carbon source was investigated in static cultivation, using serum bottles and synthetic medium. The utilization of pentose sugar resulted in a solvent yield of 0.231 g·g_sugar⁻¹, compared to 0.262 g·g_sugar⁻¹ using hexose. Then, the Organosolv-pretreated crude cellulose fibers (CF) were enzymatically decomposed, and the resulting hydrolysate medium was analyzed for inhibiting compounds (furans, organic acids, phenolics) and treated with ion exchangers for detoxification. Batch fermentation in a bioreactor using CF hydrolysate medium resulted in a total solvent yield of 0.20 g_ABE·g_sugar⁻¹.
Enzymatic hydrolysis of lignocellulosic material plays an important role in the classical biorefinery approach. Apart from the pretreatment of the raw material, hydrolysis is the basis for the conversion of the cellulose and hemicellulose fractions into fermentable sugars. After hydrolysis, a solid-liquid separation usually takes place in order to separate the residual plant material from the sugar-rich fraction, which can subsequently be used in a fermentation step. In order to eliminate this separation step, the use of alginate-immobilized crude cellulose fiber beads (CFBs) was evaluated. Pretreated cellulose fibers are incorporated into an alginate matrix together with the relevant enzymes. In doing so, sugars diffuse through the alginate matrix, allowing a simplified delivery into the surrounding fluid. This in turn reduces product inhibition of the enzyme catalysts by glucose. By means of standardized bead production, hydrolysis at lab scale was possible. First results show that liberation of glucose and xylose is possible, allowing a maximum total sugar yield of 75%.