GHEtool is a Python package that contains all the functionalities needed to deal with borefield design. It is developed for both researchers and practitioners. The core of this package is the automated sizing of a borefield under different conditions. Sizing a borefield is typically slow due to the high complexity of the mathematical background: a conventional sizing run takes on the order of minutes. Because the tool relies on a large amount of precalculated data, GHEtool can size a borefield on the order of tenths of milliseconds. This makes the tool well suited for implementation in typical workflows where iterations are required.
GHEtool also comes with a graphical user interface (GUI). The GUI is distributed as a prebuilt exe-file, which provides access to all functionalities without any coding. A setup routine that installs the GUI at a user-defined location is also implemented and available at: https://www.mech.kuleuven.be/en/tme/research/thermal_systems/tools/ghetool.
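The precalculated-data approach behind this speed-up can be sketched in a few lines of Python. The function names and the interpolation grid below are illustrative assumptions, not GHEtool's actual API: an expensive response is tabulated once offline, and later queries are answered by cheap interpolation.

```python
import bisect

# Hypothetical "expensive" response function, standing in for a full
# borefield thermal simulation (illustrative only, not GHEtool internals).
def expensive_response(depth):
    return 0.02 * depth + 150.0 / (depth + 10.0)

# Precompute once on a coarse grid (the slow part, done offline).
grid = [20.0 + 10.0 * i for i in range(20)]       # borehole depths 20..210 m
table = [expensive_response(d) for d in grid]

def fast_response(depth):
    """Linear interpolation in the precomputed table (the fast part)."""
    i = bisect.bisect_right(grid, depth) - 1
    i = max(0, min(i, len(grid) - 2))
    t = (depth - grid[i]) / (grid[i + 1] - grid[i])
    return (1.0 - t) * table[i] + t * table[i + 1]

# The interpolated value tracks the direct evaluation closely.
err = abs(fast_response(87.0) - expensive_response(87.0))
```

In GHEtool itself the precalculated quantities are thermal response data; the same lookup-instead-of-recompute pattern is what makes millisecond-scale sizing, and hence iterative design workflows, feasible.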
The scientific interest in near-Earth asteroids, as well as the interest in potentially hazardous asteroids from the perspective of planetary defense, has led the space community to focus on near-Earth asteroid mission studies. A multiple near-Earth asteroid rendezvous mission with close-up observations of several objects can help to improve the characterization of these asteroids. This work explores the design of a solar-sail spacecraft for such a mission, focusing on the search for possible sequences of encounters and the trajectory optimization. This is done in two sequential steps: a sequence search by means of a simplified trajectory model and a set of heuristic rules based on astrodynamics, and a subsequent optimization phase. A shape-based approach for solar sailing has been developed and is used for the first phase. The effectiveness of the proposed approach is demonstrated through a fully optimized multiple near-Earth asteroid rendezvous mission. The results show that it is possible to visit five near-Earth asteroids within 10 years with near-term solar-sail technology.
The scientific interest in near-Earth asteroids (NEAs) and the classification of some of them as potentially hazardous asteroids for the Earth have stimulated the interest in NEA exploration. Close-up observations of these objects will drastically increase our knowledge of the overall NEA population. For this reason, a multiple NEA rendezvous mission through solar sailing is investigated, taking advantage of the propellantless nature of this groundbreaking propulsion technology. Considering a spacecraft based on the DLR/ESA Gossamer technology, this work focuses on the search for possible sequences of NEA encounters. The effectiveness of this approach is demonstrated through a number of fully optimized trajectories. The results show that it is possible to visit five NEAs within 10 years with near-term solar-sail technology. Moreover, a study on a reduced NEA database demonstrates the reliability of the approach used, showing that 58% of the sequences found with an approximated trajectory model can be converted into real solar-sail trajectories. Lastly, this second study shows the effectiveness of the proposed automatic optimization algorithm, which is able to find solutions for a large number of mission scenarios without any input required from the user.
In proton therapy, the dose from secondary neutrons to the patient can contribute to side effects and the creation of secondary cancer. A simple and fast detection system that distinguishes between dose from protons and dose from neutrons, both in pretreatment verification and potentially in in vivo monitoring, is needed to minimize the dose from secondary neutrons. Two 3 mm long, 1 mm diameter organic scintillators were tested as candidates for a proton–neutron discrimination detector. The SCSF-3HF (1500) scintillating fibre (Kuraray Co., Chiyoda-ku, Tokyo, Japan) and the EJ-260 plastic scintillator (Eljen Technology, Sweetwater, TX, USA) were irradiated at the TRIUMF Neutron Facility and the Proton Therapy Research Centre. In the proton beam, we compared the raw Bragg peak and spread-out Bragg peak response to the industry-standard Markus chamber detector. Both scintillator sensors exhibited quenching at high LET in the Bragg peak, presenting a peak-to-entrance ratio of 2.59 for the EJ-260 and 2.63 for the SCSF-3HF fibre, compared to 3.70 for the Markus chamber. The SCSF-3HF sensor demonstrated 1.3 times the sensitivity to protons and 3 times the sensitivity to neutrons compared to the EJ-260 sensor. Combined with our equations relating neutron and proton contributions to dose during proton irradiations, and the application of Birks’ quenching correction, these fibres are valid candidates for inexpensive and replicable proton–neutron discrimination detectors.
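Birks' quenching correction mentioned above has a simple closed form: light output saturates at high stopping power, and the relation can be inverted to recover the unquenched signal. The sketch below uses an assumed, illustrative kB value, not a measured constant for SCSF-3HF or EJ-260:

```python
def birks_light_yield(dEdx, S=1.0, kB=0.09):
    """Birks' law: light output per unit length for stopping power dEdx
    (MeV/mm). kB in mm/MeV is an assumed illustrative value."""
    return S * dEdx / (1.0 + kB * dEdx)

def birks_correction(L, S=1.0, kB=0.09):
    """Invert Birks' law to recover dEdx from the quenched signal L."""
    return L / (S - kB * L)

# Round trip: quenching followed by the correction recovers the
# stopping power, which is why the peak-to-entrance ratio can be restored.
dEdx = 15.0                       # high-LET Bragg-peak region (illustrative)
recovered = birks_correction(birks_light_yield(dEdx))
```

The quenched signal grows sublinearly with dE/dx, which is exactly why the scintillators' peak-to-entrance ratios (2.59 and 2.63) fall short of the unquenched Markus chamber value (3.70).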
Analysis of Big Data Streams to obtain Braking Reliability Information for Train Protection systems
(2017)
The first and last mile of a railway journey, in both freight and transit applications, involves high effort and is either non-productive (e.g. in depot operations) or highly inefficient (e.g. in industrial railways). These parts are typically managed on-sight, i.e. with no signalling and train protection systems ensuring the freedom of movement. This is possible due to the rather short braking distances of individual vehicles and shunting consists. The present article analyses the braking behaviour of such shunting units. For this purpose, a dedicated model is developed. It is calibrated on published results of brake tests and validated against a high-definition model for low-speed applications. Based on this model, multiple simulation runs are executed to obtain a Monte Carlo simulation of the resulting braking distances. Based on the distribution properties and established safety levels, the risk of exceeding certain braking distances is evaluated and maximum braking distances are derived. Together with certain parameters of the system, these can serve in the design and safety assessment of driver assistance systems and in the automation of these processes.
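The Monte Carlo step, drawing braking distances from a distribution of decelerations and reading off a high quantile as the maximum braking distance, can be sketched as follows. All parameter values are illustrative assumptions, not the calibrated values from the article:

```python
import random

random.seed(42)

# Illustrative shunting parameters (assumed, not from the article):
v0 = 25.0 / 3.6                  # initial speed: 25 km/h in m/s
mu, sigma = 0.7, 0.1             # deceleration (m/s^2), normally distributed
t_react = 1.0                    # reaction / brake build-up time (s)

def braking_distance():
    """One Monte Carlo sample: reaction distance plus kinematic stop."""
    a = max(random.gauss(mu, sigma), 0.2)   # truncate unphysically low values
    return v0 * t_react + v0 * v0 / (2.0 * a)

samples = sorted(braking_distance() for _ in range(20000))
d95 = samples[int(0.95 * len(samples))]     # distance not exceeded in 95% of runs
```

Tightening the quantile (e.g. to the level implied by an established safety integrity target) directly yields the maximum braking distance to be assumed in the design of assistance and automation systems.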
This study reviews the practice of brake tests in freight railways, which is time-consuming and not suitable for detecting certain failure types. Public incident reports are analysed to derive a reasonable brake test hardware and communication architecture, which aims to provide automatic brake tests at lower cost than current solutions. The proposed solution relies exclusively on brake pipe and brake cylinder pressure sensors, a brake release position switch, and radio communication via standard protocols. The approach is embedded in the Wagon 4.0 concept, a holistic approach to a smart freight wagon. The reduction of manual processes yields a strong incentive due to high savings in manual labour and increased productivity.
Rare event simulation to optimise maintenance intervals of safety critical redundant subsystems
(2018)
Towards inclusion of the freight rail system in the industrial internet of things - Wagon 4.0
(2017)
In many instances, freight vehicles exchange loads or information with plants that are or will soon be Industry 4.0 plants. The Wagon 4.0 concept, as developed in close cooperation with e.g. port or mine operations, offers a maximum of railway operational efficiency while providing strong business cases already in the respective plant interaction. The Wagon 4.0 consists of the main components: a power supply, a data network, sensors, actuators and an operating system, the so-called WagonOS. The WagonOS is implemented in a granular, self-sufficient manner to allow basic features such as WiFi mesh and train christening in remote areas without network connection. Furthermore, the granularity of the operating system makes it possible to extend the familiar app concept to freight rail rolling stock, allowing specialised actuators for certain applications, e.g. an electrical parking brake or an auxiliary drive. In order to facilitate migration to the Wagon 4.0 for existing fleets, a migration concept featuring five levels of technical adaptation was developed. The present paper investigates the benefits of Wagon 4.0 implementations for the particular challenges of heavy-haul operations by focusing on train christening, ep-assisted braking, autonomous last mile and traction boost operation as well as improved maintenance schedules.
This chapter describes three general strategies to master uncertainty in technical systems: robustness, flexibility and resilience. It builds on the previous chapters about methods to analyse and identify uncertainty and may rely on the availability of technologies for particular systems, such as active components. Robustness aims for the design of technical systems that are insensitive to anticipated uncertainties. Flexibility increases the ability of a system to work under different situations. Resilience extends this characteristic by requiring a given minimal functional performance, even after disturbances or failure of system components, and it may incorporate recovery. The three strategies are described and discussed in turn. Moreover, they are demonstrated on specific technical systems.
This paper develops a new finite element method (FEM)-based upper bound algorithm for limit and shakedown analysis of hardening structures by a direct plasticity method. The hardening model is a simple two-surface model of plasticity with a fixed bounding surface. The initial yield surface can translate inside the bounding surface, and it is bounded by one of two equivalent conditions: (1) it always stays inside the bounding surface, or (2) its centre cannot move outside the back-stress surface. The algorithm provides an effective tool to analyze problems with a very high number of degrees of freedom. Our numerical results are very close to the analytical and numerical solutions in the literature.
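The two equivalent conditions of the two-surface model can be illustrated with a small von Mises sketch. The stress states and surface radii below are made-up numbers for a conceptual check, not the paper's FEM algorithm:

```python
import math

def vm(s):
    """Von Mises equivalent stress of a deviatoric stress tensor (3x3 lists)."""
    return math.sqrt(1.5 * sum(s[i][j] ** 2 for i in range(3) for j in range(3)))

def two_surface_state(stress, back_stress, sigma_y, sigma_b):
    """Classify a stress state in a simple two-surface model: the yield
    surface (radius sigma_y, centred at the back stress) may translate,
    but must stay inside the fixed bounding surface (radius sigma_b),
    i.e. its centre cannot leave the back-stress surface."""
    rel = [[stress[i][j] - back_stress[i][j] for j in range(3)] for i in range(3)]
    yielding = vm(rel) >= sigma_y
    admissible = vm(back_stress) <= sigma_b - sigma_y
    return yielding, admissible

def uni_dev(sig):
    """Deviator of a uniaxial stress state of magnitude sig."""
    return [[2 * sig / 3, 0, 0], [0, -sig / 3, 0], [0, 0, -sig / 3]]

# 300 MPa uniaxial stress, yield surface translated by a 100 MPa back stress:
yielding, admissible = two_surface_state(uni_dev(300.0), uni_dev(100.0), 250.0, 400.0)
```

The `admissible` check is the algebraic form of condition (2): as long as the back stress stays within `sigma_b - sigma_y` of the origin, the translated yield surface cannot pierce the bounding surface.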
A research framework for human aspects in the internet of production: an intra-company perspective
(2020)
Digitalization in the production sector aims at transferring concepts and methods from the Internet of Things (IoT) to industry and is, as a result, currently reshaping the production area. Besides technological progress, changes in work processes and organization are relevant for a successful implementation of the “Internet of Production” (IoP). A focus on labor organization and organizational procedures makes it necessary to consider intra-company factors such as (user) acceptance, ethical issues, and ergonomics in the context of IoP approaches. In the scope of this paper, a research approach is presented that considers these aspects from an intra-company perspective by conducting studies on the shop floor, control level and management level of companies in the production area. Centered on four dimensions (governance, organization, capabilities, and interfaces), this contribution presents a research framework aimed at a systematic integration and consideration of human aspects in the realization of the IoP.
Image reconstruction analysis for positron emission tomography with heterostructured scintillators
(2022)
The concept of structure engineering has been proposed for exploring the next generation of radiation detectors with improved performance. A TOF-PET geometry with heterostructured scintillators with a pixel size of 3.0 × 3.1 × 15 mm³ was simulated using Monte Carlo methods. The heterostructures consisted of alternating layers of BGO, a dense material with high stopping power, and plastic (EJ232), a fast light emitter. The detector time resolution was calculated as a function of the deposited and shared energy in both materials on an event-by-event basis. While the sensitivity was reduced to 32% for 100 μm thick plastic layers and 52% for 50 μm, the CTR distribution improved to 204±49 ps and 220±41 ps respectively, compared to the 276 ps that we considered for bulk BGO. The complex distribution of timing resolutions was accounted for in the reconstruction: we divided the events into three groups based on their CTR and modeled them with different Gaussian TOF kernels. On a NEMA IQ phantom, the heterostructures had better contrast recovery in early iterations. On the other hand, BGO achieved a better contrast-to-noise ratio (CNR) after the 15th iteration due to its higher sensitivity. The developed simulation and reconstruction methods constitute new tools for evaluating different detector designs with complex time responses.
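The Gaussian TOF kernels used in such a reconstruction follow directly from the CTR values: the CTR (an FWHM in time) maps to a spatial standard deviation along the line of response. A minimal sketch, with illustrative group weights that are not the simulated event fractions:

```python
import math

FWHM_TO_SIGMA = 1.0 / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # ~ 1/2.355
C_MM_PER_PS = 0.2998              # speed of light in mm/ps

def tof_kernel(ctr_ps):
    """Gaussian TOF kernel along the line of response for a given CTR
    (FWHM in ps). The factor 1/2 reflects the two-photon geometry."""
    sigma_x = C_MM_PER_PS * ctr_ps * FWHM_TO_SIGMA / 2.0
    def kernel(x_mm):
        return math.exp(-0.5 * (x_mm / sigma_x) ** 2) / (sigma_x * math.sqrt(2.0 * math.pi))
    return kernel, sigma_x

# Three event groups with the CTRs from the simulation above
# (the weights are illustrative assumptions):
groups = [(0.3, 204.0), (0.3, 220.0), (0.4, 276.0)]
kernels = [(w, tof_kernel(ctr)[0]) for w, ctr in groups]

def mixture(x_mm):
    """Effective TOF kernel: weighted mixture of the three Gaussians."""
    return sum(w * k(x_mm) for w, k in kernels)

_, sigma_best = tof_kernel(204.0)   # ~13 mm localization for the best group
```

Modelling each group with its own kernel, rather than one averaged Gaussian, is what lets the sharper heterostructure events contribute their full localization benefit in early iterations.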
Anyone who has always wanted to understand the hieroglyphs on Sheldon's blackboard in the TV series The Big Bang Theory or who wanted to know exactly what the fate of Schrödinger's cat is all about will find a short, descriptive introduction to the world of quantum mechanics in this essential. The text particularly focuses on the mathematical description in the Hilbert space. The content goes beyond popular scientific presentations, but is nevertheless suitable for readers without special prior knowledge thanks to the clear examples.
Detailed simulations of heat conduction in complicated porous media often have long runtimes. Homogenization is then a powerful tool to speed up the calculations while preserving accurate solutions. Unfortunately, real structures are generally non-periodic, which requires impractical, complicated homogenization techniques. We demonstrate in this paper that the application of simple periodic techniques to realistic media that are merely close to periodic gives accurate approximate solutions. In order to obtain effective parameters for the homogenized heat equation, we have to solve a so-called “cell problem”. In contrast to periodic structures, it is not trivial to determine a suitable unit cell that represents a non-periodic medium. To overcome this problem, we give a rule of thumb on how to choose a good cell. Finally, we demonstrate the efficiency of our method for virtually generated foams as well as real foams and compare these results to periodic structures.
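For a 1D periodic laminate the cell problem has a classical closed-form answer, which makes the idea of effective parameters concrete: flux across the layers sees the volume-weighted harmonic mean of the conductivities, flux along the layers the arithmetic mean. A sketch of that textbook result (not the paper's foam computations):

```python
def k_eff_series(fractions, conductivities):
    """Harmonic mean: effective conductivity for heat flux
    perpendicular to the layers of a 1D laminate."""
    return 1.0 / sum(f / k for f, k in zip(fractions, conductivities))

def k_eff_parallel(fractions, conductivities):
    """Arithmetic mean: effective conductivity for flux along the layers."""
    return sum(f * k for f, k in zip(fractions, conductivities))

# Illustrative foam-like composite: 30% metallic strut material, 70% air.
f, k = [0.3, 0.7], [200.0, 0.03]
k_series = k_eff_series(f, k)   # dominated by the poorly conducting phase
k_par = k_eff_parallel(f, k)    # dominated by the well conducting phase
```

The two bounds differ by three orders of magnitude here, which is why choosing a representative unit cell, and hence the right effective tensor, matters so much for nearly periodic media.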
Numerical solution of the heat equation with non-linear, time derivative-dependent source term
(2010)
The mathematical modeling of heat conduction with adsorption effects in coated metal structures yields the heat equation with piecewise smooth coefficients and a new kind of source term. This term is special because it is non-linear and, furthermore, depends on a time derivative. In our approach, we reformulated this as a new problem for the usual heat equation, without a source term but with a new non-linear coefficient. We give an existence and uniqueness proof for the weak solution of the reformulated problem. To obtain a numerical solution, we developed a semi-implicit and a fully implicit finite volume method. We compared these two methods theoretically as well as numerically. Finally, as a practical application, we simulated the heat conduction in coated aluminum fibers with adsorption in the zeolite coating.
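A fully implicit finite volume step for the plain 1D heat equation (without the adsorption source term treated in the paper) reduces to one tridiagonal solve per time step. A self-contained sketch using the classical Thomas algorithm:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def implicit_step(u, alpha, dt, dx):
    """One fully implicit (backward Euler) finite volume step for
    u_t = alpha * u_xx with fixed Dirichlet boundary values."""
    n = len(u)
    r = alpha * dt / dx ** 2
    a = [-r] * n
    b = [1.0 + 2.0 * r] * n
    c = [-r] * n
    a[0] = c[0] = a[-1] = c[-1] = 0.0   # hold boundary cells fixed
    b[0] = b[-1] = 1.0
    return thomas(a, b, c, list(u))

# Cooling of an initially hot interior between cold boundaries.
u = [0.0] + [1.0] * 49 + [0.0]
for _ in range(200):
    u = implicit_step(u, alpha=1.0, dt=0.1, dx=0.1)
```

The fully implicit variant is unconditionally stable, so the time step is not restricted by the grid spacing; the semi-implicit variant trades some of that robustness for a cheaper treatment of the non-linear coefficient.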
This paper describes two courses on simulation methods for graduate students: “Simulation Methods” and “Simulation and Optimization in Virtual Engineering”. The courses were planned to teach young engineers how to work with simulation software as well as to understand the necessary mathematical background. COMSOL is used as the simulation software. The main philosophy was to combine theory and practice in a way that motivates the students. In addition, “soft skills” should be improved. This was achieved by project work as the final examination. As the underlying didactical principle, the ideas of Bloom’s revised taxonomy were followed. The paper basically focuses on educational aspects, e.g. how to structure the course, plan the exercises, organize the project work and include practical COMSOL examples.
Surgical clip applicator
(1996)
The method of fundamental solutions is applied to the approximate computation of interior transmission eigenvalues for a special class of inhomogeneous media in two dimensions. We give a short approximation analysis, accompanied by numerical results that clearly demonstrate the practical convenience of our alternative approach.
This paper investigates the interior transmission problem for homogeneous media via eigenvalue trajectories parameterized by the magnitude of the refractive index. In the case that the scatterer is the unit disk, we prove that there is a one-to-one correspondence between complex-valued interior transmission eigenvalue trajectories and Dirichlet eigenvalues of the Laplacian which turn out to be exactly the trajectorial limit points as the refractive index tends to infinity. For general simply-connected scatterers in two or three dimensions, a corresponding relation is still open, but further theoretical results and numerical studies indicate a similar connection.
IT Service Deployment
(2007)
IT products are viewed and managed differently depending on the perspective and the stage within the life cycle. A model is presented that integrates different perspectives and stages, serving as an aid for the analysis of business models and the focused positioning of IT products. Four generic business models are analysed with regard to the product management function in general and the positioning field for IT products specifically: off-the-shelf (license), license plus service, project, and system service (incl. cloud computing).
A Cooperative Work Environment for Evolutionary Software Development / Kurbel, K., Pietsch, W.
(1990)
A Portable Implementation of Index Sequential Input-Output [Part 1] / Kurbel, Karl; Pietsch, W.
(1986)
A Portable Implementation of Index Sequential Input-Output [Part 2] / Kurbel, Karl; Pietsch, W.
(1986)
Knowledge Management
(2001)
In this paper, methods of sample preparation for the potentiometric measurement of phenylalanine are presented. Based on the spectrophotometric measurements of phenylalanine, the concentrations of the reagents of the enzymatic reaction (10 mM L-Phe, 0.4 mM NAD+, 2 U L-PheDH) were determined. Then, the absorption spectrum of the reaction product, NADH, was monitored (maximum peak at 340 nm). The results obtained by the spectrophotometric method were compared with the results obtained by colourimetry, using pH indicators. The two above-mentioned methods will be used as references for potentiometric measurements of phenylalanine concentration.
In this paper, methods of surface modification of different supports, i.e. glass and polymeric beads, for enzyme immobilisation are described. The developed method of enzyme immobilisation is based on Schiff’s base formation between the amino groups on the enzyme surface and the aldehyde groups on the chemically modified surface of the supports. The surface of silicon modified by APTS and GOPS with immobilised enzyme was characterised by atomic force microscopy (AFM), time-of-flight secondary ion mass spectrometry (ToF-SIMS) and infrared spectroscopy (FTIR). The supports with immobilised enzyme (urease) were also tested in combination with microreactors fabricated in silicon and Perspex, operating in a flow-through system. For microreactors filled with urease immobilised on glass beads (Sigma) and on polymeric beads (PAN), a very high and stable signal (pH change) was obtained. The developed method of urease immobilisation proved to be very effective.
An enzyme-based multi-parameter biosensor is developed for monitoring the concentration of formate, d-lactate, and l-lactate in biological samples. The sensor is based on the specific dehydrogenation by an oxidized β-nicotinamide adenine dinucleotide (NAD+)-dependent dehydrogenase (formate dehydrogenase, d-lactic dehydrogenase, and l-lactic dehydrogenase, respectively) in combination with a diaphorase from Clostridium kluyveri (EC 1.8.1.4). The enzymes are immobilized on a platinum working electrode by cross-linking with glutaraldehyde (GA). The principle of the determination scheme in the case of l-lactate is as follows: l-lactic dehydrogenase (l-LDH) converts l-lactate into pyruvate by reaction with NAD+. In the presence of hexacyanoferrate(III), the resulting reduced β-nicotinamide adenine dinucleotide (NADH) is then regenerated enzymatically by diaphorase. The electrochemical detection is based on the current generated by oxidation of hexacyanoferrate(II) at an applied potential of +0.3 V vs. an Ag/AgCl reference electrode. The biosensor is electrochemically characterized in terms of linear working range and sensitivity. Additionally, the successful practical application of the sensor is demonstrated in an extract from maize silage.
Multi-analyte biosensors may offer the opportunity to perform cost-effective and rapid analysis with reduced sample volume, as compared to electrochemical biosensing of each analyte individually. This work describes the development of an enzyme-based biosensor system for multi-parametric determination of four different organic acids. The biosensor array comprises five working electrodes for simultaneous sensing of ethanol, formate, d-lactate, and l-lactate, and an integrated counter electrode. Storage stability of the biosensor was evaluated under different conditions (stored at +4 °C in buffer solution and dry at −21 °C, +4 °C, and room temperature) over a period of 140 days. After repeated and regular application, the individual sensing electrodes exhibited the best stability when stored at −21 °C. Furthermore, measurements in silage samples (maize and sugarcane silage) were conducted with the portable biosensor system. Comparison with a conventional photometric technique demonstrated successful employment for rapid monitoring of complex media.
The immobilization of NAD+-dependent dehydrogenases, in combination with a diaphorase, enables the facile development of multiparametric sensing devices. In this work, an amperometric biosensor array for simultaneous determination of ethanol, formate, d- and l-lactate is presented. Enzyme immobilization on platinum thin-film electrodes was realized by chemical cross-linking with glutaraldehyde. The optimization of the sensor performance was investigated with regard to enzyme loading, glutaraldehyde concentration, pH, cofactor concentration and temperature. Under optimal working conditions (potassium phosphate buffer with pH 7.5, 2.5 mmol L-1 NAD+, 2.0 mmol L-1 ferricyanide, 25 °C and 0.4% glutaraldehyde) the linear working range and sensitivity of the four sensor elements was improved. Simultaneous and cross-talk free measurements of four different metabolic parameters were performed successfully. The reliable analytical performance of the biosensor array was demonstrated by application in a clarified sample of inoculum sludge. Thereby, a promising approach for on-site monitoring of fermentation processes is provided.
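The sensitivity reported for such amperometric sensor elements is the slope of the calibration line in the linear working range. A minimal least-squares sketch with made-up calibration data (not measured values from this work):

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Illustrative calibration points within an assumed linear working range:
conc = [0.0, 0.5, 1.0, 2.0, 4.0]              # analyte, mmol/L
current = [0.02, 0.51, 1.03, 2.01, 3.98]      # electrode current, uA

sensitivity, offset = linear_fit(conc, current)   # sensitivity in uA per mmol/L
```

The upper end of the linear working range is where measured points start falling below this fitted line, e.g. due to enzyme saturation; the fit is therefore only applied to the linear region.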
Attitude and Orbital Dynamics Modeling for an Uncontrolled Solar-Sail Experiment in Low-Earth Orbit
(2015)
Gossamer-1 is the first project of the three-step Gossamer roadmap, the purpose of which is to develop, prove and demonstrate that solar-sail technology is a safe and reliable propulsion technique for long-lasting and high-energy missions. This paper first presents the structural analysis performed on the sail to understand its elastic behavior. The results are then used in attitude and orbital simulations. The model considers the main forces and torques that a satellite experiences in low-Earth orbit, coupled with the sail deformation. Performing the simulations for varying initial conditions in attitude and rotation rate, the results show initial states to avoid and the maximum rotation rates reached for correct and faulty deployment of the sail. Lastly, comparisons with the classic flat-sail model are carried out to test the hypothesis that the elastic behavior does play a role in the attitude and orbital behavior of the sail.
Optoelectronic Properties of Nanostructured Ensembles Controlled by Biomolecular Logic Systems
(2008)
A new and simple method for nanostructuring using conventional photolithography and a layer expansion or pattern-size reduction technique is presented, which can further be applied to the fabrication of different nanostructures and nano-devices. The method is based on the conversion of a photolithographically patterned metal layer into a metal-oxide mask with improved pattern-size resolution using thermal oxidation. With this technique, the pattern size can be scaled down to dimensions of several nanometers. The proposed method is demonstrated experimentally by preparing nanostructures with different configurations and layouts, such as circles, rectangles, trapezoids, and “fluidic-channel”-, “cantilever”- and meander-type structures.
Sensing charged macromolecules with nanocrystalline diamond-based field-effect capacitive sensors
(2008)
Novel concepts for flow-rate and flow-direction determination by means of pH-sensitive ISFETs
(2001)
Label-free Electrostatic Detection of DNA Amplification by PCR Using Capacitive Field-effect Devices
(2016)
A capacitive field-effect EIS (electrolyte-insulator-semiconductor) sensor modified with a bilayer of a positively charged weak polyelectrolyte, poly(allylamine hydrochloride) (PAH), and single-stranded probe DNA (ssDNA) has been used for the label-free electrostatic detection of pathogen-specific DNA amplification via the polymerase chain reaction (PCR). The sensor is able to distinguish between positive and negative PCR solutions and to detect the existence of target DNA amplicons in PCR samples, and thus can be used as a tool for quick verification of DNA amplification and a successful PCR process.