Vectrino profiler spatial filtering for shear flows based on the mean velocity gradient equation
(2018)
A new methodology is proposed to spatially filter acoustic Doppler velocimetry data from a Vectrino profiler based on the differential mean velocity equation. Lower and upper bounds are formulated in terms of physically based flow constraints. Practical implementation is discussed, and the method is tested against data gathered from an open-channel flow over a stepped macro-roughness surface. The method detects outliers over the entire distance range sampled by the Vectrino profiler and remains applicable outside the region of validity of the velocity gradient equation. Finally, a statistical analysis suggests that the physically obtained bounds are asymptotically representative.
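The core idea of bound-based despiking can be sketched in a few lines. The following is a deliberately simplified, hypothetical version of the criterion: a sample is rejected when the finite-difference velocity gradient on both of its sides exceeds a physically based bound, which is characteristic of an isolated spike. The bound value and the data are illustrative, not from the paper.

```python
import numpy as np

def filter_profile(z, u, dudz_max):
    """Flag velocity samples as valid/invalid using a gradient bound.
    Simplified stand-in for the mean-velocity-gradient criterion:
    an isolated spike violates the bound on both adjacent gradients."""
    u = np.asarray(u, dtype=float)
    z = np.asarray(z, dtype=float)
    d = np.diff(u) / np.diff(z)          # gradient between neighbours
    valid = np.ones(u.shape, dtype=bool)
    for i in range(1, len(u) - 1):
        if abs(d[i - 1]) > dudz_max and abs(d[i]) > dudz_max:
            valid[i] = False             # reject the spike only
    return valid

z = np.linspace(0.0, 0.03, 7)            # profiling range [m], made up
u = [0.10, 0.12, 0.13, 0.55, 0.15, 0.16, 0.17]  # one obvious spike
print(filter_profile(z, u, dudz_max=20.0))
```

Only the spiked sample is rejected; its neighbours, whose one-sided gradients are large only toward the spike, are kept.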
In this study, flexible calorimetric gas sensors are developed for the specific detection of gaseous hydrogen peroxide (H₂O₂) over a wide concentration range, as used in sterilization processes in the aseptic packaging industry. The flexibility of these sensors is an advantage for identifying the chemical components of the sterilant at the corners of food boxes, the so-called “cold spots”, which are critical locations in aseptic packaging and therefore of great importance. The sensors are fabricated on flexible polyimide films by means of thin-film technology. Thin layers of titanium and platinum are deposited on the polyimide to define the conductive structures of the sensors. To detect the high-temperature evaporated H₂O₂, a differential temperature set-up is proposed. The sensors are evaluated in a laboratory-scale sterilization system that simulates the sterilization process. Over the tested concentration range of evaporated H₂O₂ from 0 to 7.7% v/v, the sensors successfully detected both high and low H₂O₂ concentrations with a sensitivity of 5.04 °C/% v/v. The characterization of the sensors confirms their precise fabrication, high sensitivity and the novelty of low-concentration H₂O₂ detection for future inline monitoring of food-package sterilization.
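Given the reported sensitivity of 5.04 °C per % v/v, a linear calorimetric response lets one invert the differential temperature signal to a concentration estimate. This is a minimal sketch under that linearity assumption; the conversion function and the example signal value are illustrative.

```python
SENSITIVITY = 5.04  # °C per % v/v, as reported above

def h2o2_concentration(delta_t):
    """Estimate the H2O2 concentration (% v/v) from the differential
    temperature signal, assuming a linear calorimetric response."""
    return delta_t / SENSITIVITY

# a hypothetical differential temperature near the top of the range
print(round(h2o2_concentration(38.8), 1))  # ≈ 7.7 % v/v
```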
We propose a stochastic programming method to analyse the limit and shakedown loads of structures under random strength following a lognormal distribution. A dual chance-constrained programming algorithm is developed to calculate simultaneously the upper and lower bounds of the plastic collapse limit or the shakedown limit. The edge-based smoothed finite element method (ES-FEM) with three-node linear triangular elements is used.
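In its simplest one-variable form, a chance constraint on a lognormal strength asks for the largest load factor q such that P(strength ≥ q) meets a prescribed reliability α, i.e. q is the (1 − α)-quantile of the strength distribution. The sketch below, with made-up parameters, inverts the lognormal CDF by bisection; it illustrates the constraint, not the paper's dual ES-FEM algorithm.

```python
import math

def lognormal_cdf(x, mu, sigma):
    """CDF of a lognormal variable with log-mean mu and log-std sigma."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

def chance_constrained_limit(mu, sigma, alpha):
    """Largest q with P(strength >= q) >= alpha, found by bisecting
    the lognormal CDF for F(q) = 1 - alpha (illustrative only)."""
    lo, hi = 1e-9, math.exp(mu + 10.0 * sigma)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if lognormal_cdf(mid, mu, sigma) < 1.0 - alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# median strength 1.0, 10 % log-std, 95 % reliability (made-up numbers)
print(round(chance_constrained_limit(0.0, 0.1, 0.95), 4))  # ≈ 0.8483
```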
In the present work an optical sensor in combination with a spectrally resolved detection device for in-line particle-size monitoring for quality control in beer production is presented. The principle relies on the size- and wavelength-dependent backscatter of growing particles in fluids. Measured interference structures of backscattered light are compared with calculated theoretical values based on Mie theory and fitted with a linear least-squares method to obtain particle size distributions. For this purpose, a broadband light source in combination with a process CCD spectrometer (charge-coupled device spectrometer) and process-adapted fiber optics is used. The goal is the development of an easy and flexible measurement device for in-line monitoring of particle size. The presented device can be installed directly in product fill tubes or vessels, is compatible with CIP (cleaning in place) procedures and removes the need for sample taking. A proof of concept and preliminary results, measuring protein precipitation, are presented.
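The fitting step can be illustrated with a toy model: build a basis of simulated spectra for candidate particle sizes, then solve for the size-distribution weights by linear least squares. The cosine "interference pattern" below is a stand-in for the Mie-theory spectra; all numbers are hypothetical.

```python
import numpy as np

wavelengths = np.linspace(400e-9, 700e-9, 50)  # visible range [m]

def model_signal(diameter):
    """Toy interference pattern whose fringe frequency grows with
    particle size (placeholder for a Mie-theory calculation)."""
    return np.cos(2.0 * np.pi * diameter / wavelengths)

sizes = np.array([1e-6, 2e-6, 3e-6])           # candidate diameters [m]
basis = np.stack([model_signal(d) for d in sizes], axis=1)

# a "measured" spectrum: 70 % of 2 µm and 30 % of 3 µm particles
measured = 0.7 * model_signal(2e-6) + 0.3 * model_signal(3e-6)

# linear least squares for the size-distribution weights
weights, *_ = np.linalg.lstsq(basis, measured, rcond=None)
print(np.round(weights, 3))
```

Because the synthetic spectrum lies exactly in the span of the basis, the recovered weights are (0, 0.7, 0.3) up to numerical precision.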
Hydrogen represents a possible alternative gas turbine fuel for enhanced fuel flexibility in future low-emission power generation, provided the hydrogen is produced from renewable energy sources such as wind energy or biomass. Kawasaki Heavy Industries, Ltd. (KHI) runs research and development projects for a future hydrogen society: production of hydrogen gas, refinement and liquefaction for transportation and storage, and utilization in gas turbines and gas engines for the generation of electricity. In the development of hydrogen gas turbines, a key technology is stable, low-NOx hydrogen combustion, especially Dry Low Emission (DLE) or Dry Low NOx (DLN) hydrogen combustion. Because the physical properties of hydrogen differ greatly from those of other fuels such as natural gas, well-established gas turbine combustion systems cannot be directly applied to DLE hydrogen combustion. Thus, the development of DLE hydrogen combustion technologies is an essential and challenging task for the future of hydrogen-fueled gas turbines. The DLE Micro-Mix combustion principle for hydrogen fuel has been under development for many years to significantly reduce NOx emissions. This combustion principle is based on cross-flow mixing of air and gaseous hydrogen, which reacts in multiple miniaturized “diffusion-type” flames. The major advantages of this principle are its inherent safety against flashback and its low NOx emissions, owing to the very short residence time of the reactants in the flame region of the micro-flames.
The Kremer-Grest (KG) bead-spring model is a near standard in Molecular Dynamics simulations of generic polymer properties. It owes its popularity to its computational efficiency rather than its ability to represent specific polymer species and conditions. Here we investigate how to adapt the model to match the universal properties of a wide range of chemical polymer species. For this purpose we vary a single parameter originally introduced by Faller and Müller-Plathe: the chain stiffness. Examples include polystyrene, polyethylene, polypropylene, cis-polyisoprene, polydimethylsiloxane, polyethylene oxide and styrene-butadiene rubber. We do this by matching the number of Kuhn segments per chain and the number of Kuhn segments per cubic Kuhn volume for the polymer species and for the Kremer-Grest model. We also derive mapping relations for converting KG model units back to physical units; in particular, we obtain the entanglement time for the KG model as a function of stiffness, allowing for a time mapping. To test these relations, we generate large, equilibrated, well-entangled polymer melts and measure the entanglement moduli using a static primitive-path analysis of the entangled melt structure as well as by simulations of step-strain deformation of the model melts. The moduli obtained for our model polymer melts are in good agreement with the experimentally expected moduli.
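The two dimensionless targets of the mapping named above can be computed directly: the number of Kuhn segments per chain, N_K = M / M_Kuhn, and the number of Kuhn segments per cubic Kuhn length, n_K = ρ_K b³. The sketch below uses made-up example numbers, not literature values for any of the listed polymers.

```python
def kuhn_descriptors(M, M_kuhn, rho_kuhn, b):
    """Dimensionless targets for a KG mapping (illustrative sketch):
    N_K : Kuhn segments per chain, matched by the KG chain length
    n_K : Kuhn segments per cubic Kuhn length, matched via stiffness"""
    N_K = M / M_kuhn                   # segments per chain
    n_K = rho_kuhn * b ** 3            # segments per Kuhn volume
    return N_K, n_K

# hypothetical polymer: 100 kg/mol chains, 0.5 kg/mol Kuhn mass,
# Kuhn segment number density 2.6e27 m^-3, Kuhn length 1 nm
N_K, n_K = kuhn_descriptors(M=100.0, M_kuhn=0.5, rho_kuhn=2.6e27, b=1.0e-9)
print(N_K, round(n_K, 2))  # 200.0 2.6
```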
In industrial applications, the costs for the operation and maintenance of a pump system typically far exceed its purchase price. For finding an optimal pump configuration which minimizes not only the investment but the life-cycle costs, methods like Technical Operations Research, which is based on Mixed-Integer Programming, can be applied. However, during the planning phase the designer is often faced with uncertain input data, e.g. future load demands can only be estimated. In this work, we deal with this uncertainty by developing a chance-constrained two-stage (CCTS) stochastic program. The design and operation of a booster station working under uncertain load demand are optimized to minimize total cost, including the purchase price, the operating cost incurred by energy consumption and the penalty cost resulting from water shortage. We find optimized system layouts using a sample average approximation (SAA) algorithm and analyze the results for different risk levels of water shortage. By adjusting the risk level, the costs and performance range of the system can be balanced, and thus the system’s resilience can be engineered.
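The SAA idea can be sketched compactly: draw demand samples, then pick the cheapest design whose empirical shortage frequency stays below the chosen risk level. The design catalogue, demand distribution and risk level below are all hypothetical, and real booster-station models also optimize the second-stage operation, which this sketch omits.

```python
import random

def saa_design(designs, demand_sampler, risk, n_samples=2000, seed=0):
    """Sample average approximation (sketch): cheapest design whose
    empirical shortage probability does not exceed `risk`.
    `designs` maps capacity -> purchase cost (hypothetical data)."""
    rng = random.Random(seed)
    demands = [demand_sampler(rng) for _ in range(n_samples)]
    for capacity, cost in sorted(designs.items(), key=lambda kv: kv[1]):
        shortfall = sum(d > capacity for d in demands) / n_samples
        if shortfall <= risk:           # chance constraint satisfied
            return capacity, cost
    return None                         # no feasible design in catalogue

designs = {50: 10_000, 80: 16_000, 120: 25_000}  # m³/h -> EUR, made up
sampler = lambda rng: rng.gauss(60, 15)          # uncertain load demand
print(saa_design(designs, sampler, risk=0.05))
```

With these numbers the 80 m³/h station is short of demand roughly 9 % of the time, so the 5 % risk level forces the larger 120 m³/h design; raising the risk level would trade resilience for cost.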
Highly competitive markets paired with tremendous production volumes demand particularly cost-efficient products. The use of common parts and modules across product families can potentially reduce production costs. Yet increasing commonality typically results in the overdesign of individual products. Multi-domain virtual prototyping enables designers to evaluate the costs and technical feasibility of different single-product designs at reasonable computational effort in early design phases. However, savings through platform commonality are hard to quantify and require detailed knowledge of, e.g., the production process and the supply chain. Therefore, we present and evaluate a multi-objective metamodel-based optimization algorithm which enables designers to explore the trade-off between high commonality and cost-optimal design of single products.
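Exploring a trade-off between two competing objectives amounts to identifying the non-dominated (Pareto-optimal) designs. The brute-force filter below, with invented (cost, overdesign) pairs, shows the idea in its simplest form; the paper's metamodel-based algorithm searches this front far more efficiently.

```python
def pareto_front(points):
    """Return the non-dominated points for two minimization
    objectives (cost, overdesign penalty). Brute-force sketch."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# hypothetical designs: (unit cost, overdesign from shared platform)
designs = [(10, 5), (8, 9), (12, 2), (9, 8), (11, 6)]
print(pareto_front(designs))
```

Here (11, 6) is dropped because (10, 5) is better in both objectives; the remaining designs each represent a different commonality-versus-cost compromise.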
Sleep scoring is a necessary and time-consuming task in sleep studies. In animal models (such as mice) or in humans, automating this tedious process promises to facilitate long-term studies and to promote sleep biology as a data-driven field. We introduce a deep neural network model that is able to predict different states of consciousness (Wake, Non-REM, REM) in mice from EEG and EMG recordings with excellent scoring results for out-of-sample data. Predictions are made on epochs of 4 seconds length, and epochs are classified as artifact-free or not. The model architecture draws on recent advances in deep learning and in convolutional neural networks research. In contrast to previous approaches towards automated sleep scoring, our model does not rely on manually defined features of the data but learns predictive features automatically. We expect deep learning models like ours to become widely applied in different fields, automating many repetitive cognitive tasks that were previously difficult to tackle.
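The epoching step that feeds such a classifier is straightforward to sketch: slice the continuous EEG/EMG trace into fixed 4-second windows, one prediction per window. The sampling rate and signal below are placeholders, and the network itself is omitted.

```python
import numpy as np

def epochs(signal, fs, epoch_s=4.0):
    """Split a continuous trace into the 4-second epochs that a
    scoring model classifies (Wake / Non-REM / REM). Any trailing
    partial epoch is dropped. Sketch only."""
    n = int(fs * epoch_s)               # samples per epoch
    usable = (len(signal) // n) * n     # drop the incomplete tail
    return np.asarray(signal[:usable]).reshape(-1, n)

fs = 128                                # hypothetical sampling rate [Hz]
trace = np.random.randn(10 * fs)        # 10 s of fake signal
print(epochs(trace, fs).shape)          # (2, 512): two full 4-s epochs
```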
This paper presents the NLP Lean Programming framework (NLPf), a new framework for creating custom natural language processing (NLP) models and pipelines by utilizing common software development build systems. This approach allows developers to train and integrate domain-specific NLP pipelines into their applications seamlessly. Additionally, NLPf provides an annotation tool which improves the annotation process significantly by providing a well-designed GUI and a sophisticated way of using input devices. Due to NLPf’s properties, developers and domain experts are able to build domain-specific NLP applications more efficiently. NLPf is open-source software and available at https://gitlab.com/schrieveslaach/NLPf.
Seismic design of buried pipeline systems for energy and water supply is important not only for plant and operational safety but also for the maintenance of the supply infrastructure after an earthquake. The present paper discusses special issues of seismic wave impacts on buried pipelines, describes calculation methods, proposes approaches and gives calculation examples. It addresses the effects of transient displacement differences and the resulting stresses within the pipeline due to the wave propagation of the earthquake. However, the presented model can also be used to calculate fault-rupture-induced displacements. Based on a three-dimensional finite element model, parameter studies are performed to show the influence of several parameters such as the incoming wave angle, wave velocity, backfill height and synthetic displacement time histories. The interaction between the pipeline and the surrounding soil is modeled with non-linear soil springs, and the propagating wave is simulated acting on the pipeline pointwise, independently in time and space. Special attention is given to long-distance heat pipeline systems. Here, expansion bends are arranged at regular distances to accommodate movements of the pipeline due to high temperature. Such expansion bends are usually designed with small bending radii, which during an earthquake lead to high bending stresses in the cross-section of the pipeline. Finally, an interpretation of the results and recommendations for the most critical parameters are given.
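A common simplification for the non-linear soil springs mentioned above is a bilinear, elastic-perfectly-plastic law: force grows linearly with relative displacement up to the ultimate soil resistance, then saturates. The sketch below uses placeholder stiffness and resistance values, not parameters from the paper.

```python
def soil_spring_force(u, k, f_ult):
    """Bilinear soil-spring law coupling pipeline and soil:
    linear with stiffness k until the ultimate resistance f_ult
    (per unit length), then perfectly plastic in either direction."""
    f = k * u
    return max(-f_ult, min(f_ult, f))   # clamp to +/- f_ult

# hypothetical axial spring: k = 2e6 N/m per metre, f_ult = 30 kN/m
print(soil_spring_force(0.005, 2e6, 30e3))  # elastic branch: 10 kN/m
print(soil_spring_force(0.050, 2e6, 30e3))  # capped at 30 kN/m
```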
Often, research results from collaboration projects are not transferred into productive environments, even though the approaches are proven to work in demonstration prototypes. These demonstration prototypes are usually too fragile and error-prone to be transferred easily into productive environments; a lot of additional work is required. Inspired by the idea of an incremental delivery process, we introduce an architecture pattern which combines the approach of Metrics Driven Research Collaboration with microservices for ease of integration. It enables keeping track of project goals over the course of the collaboration while every party may focus on their expert skills: researchers may focus on complex algorithms, practitioners may focus on their business goals. Through the simplified integration, (intermediate) research results can be introduced into a productive environment, which enables early user feedback and allows for the early evaluation of different approaches. The practitioners’ business model benefits throughout the full project duration.
In this work, we report on our attempt to design and implement an early introduction to basic robotics principles for children at kindergarten age. One of the main challenges of this effort is to explain complex robotics contents in a way that pre-school children can follow the basic principles and ideas, using examples from their world of experience. What sets our effort apart from other work is that part of the lecturing is actually done by a robot itself and that a quiz at the end of the lesson is conducted using robots as well. The humanoid robot Pepper from Softbank, which is a great platform for human–robot interaction experiments, was used to present a lecture on robotics by reading out the contents to the children, making use of its speech synthesis capability. A quiz in a Runaround-game-show style after the lecture activated the children to recap the contents they had acquired about how mobile robots work in principle. In this quiz, two LEGO Mindstorms EV3 robots were used to implement a strongly interactive scenario. Beyond the thrill of being exposed to a mobile robot that reacted to them, the children were very excited and at the same time very concentrated. We received very positive feedback from the children as well as from their educators. To the best of our knowledge, this is one of only a few attempts to use a robot like Pepper not as a tele-teaching tool, but as the teacher itself, in order to engage pre-school children with complex robotics contents.