Useful market simulations are key to evaluating different market designs consisting of multiple market mechanisms or rules. Yet no simulation framework designed with the comparison of different market mechanisms in mind was found. The need for an objective view on different sets of market rules while investigating meaningful agent strategies leads to the conclusion that such a simulation framework is needed to advance research on this subject. An overview of different existing market simulation models is given, which also shows the research gap and the missing capabilities of those systems. Finally, a methodology is outlined for how a novel market simulation that can answer the research questions can be developed.
Light-addressable potentiometric sensors (LAPS) are semiconductor-based potentiometric sensors with the advantage of detecting the concentration of a chemical species in a liquid solution above the sensor surface in a spatially resolved manner. The addressing is achieved by a modulated and focused light source illuminating the semiconductor and generating a concentration-dependent photocurrent. This work introduces a LAPS set-up that is able to monitor the electrical impedance in addition to the photocurrent. The impedance spectra of a LAPS structure, with and without illumination, as well as the frequency behaviour of the LAPS measurement are investigated. The measurements are supported by electrical equivalent circuits to explain the impedance and the LAPS frequency behaviour. The work investigates the influence of different parameters on the frequency behaviour of the LAPS. Furthermore, the phase shift of the photocurrent, the influence of the surface potential as well as the changes of the sensor impedance are discussed.
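As an illustration of the equivalent-circuit reasoning, a minimal sketch of the frequency-dependent impedance of a generic series-resistance/parallel-RC network (the component values here are hypothetical and not fitted to the LAPS structure described above):

```python
import math

def parallel_rc_impedance(r_s, r_p, c_p, f):
    """Complex impedance of R_s in series with a parallel R_p || C_p element."""
    omega = 2 * math.pi * f
    z_c = 1 / (1j * omega * c_p)        # capacitor branch
    z_par = (r_p * z_c) / (r_p + z_c)   # parallel combination
    return r_s + z_par

# Low frequency: the capacitor blocks, so |Z| approaches R_s + R_p.
# High frequency: the capacitor shorts out R_p, so |Z| approaches R_s.
z_low = parallel_rc_impedance(100.0, 1e6, 1e-9, 0.01)
z_high = parallel_rc_impedance(100.0, 1e6, 1e-9, 1e9)
```

Sweeping `f` over several decades reproduces the characteristic transition between these two plateaus that equivalent-circuit fits exploit.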
Frequency Dependent Impedance Analysis of the Foundation-Soil-Systems of Onshore Wind Turbines
(2018)
Frequency mixing magnetic detection (FMMD) has been explored for applications in magnetic biosensing, multiplex detection of magnetic nanoparticles (MNP) and the determination of the core size distribution of MNP samples. Such applications rely on a static offset magnetic field, which is traditionally generated with an electromagnet. Such a setup requires a current source as well as passive or active cooling strategies, which directly limits the portability desired for point-of-care (POC) monitoring applications. In this work, a measurement head is introduced that utilizes two ring-shaped permanent magnets to generate the static offset magnetic field. A steel cylinder in the ring bores homogenizes the field. By varying the distance between the ring magnets and the thickness of the steel cylinder, the magnitude of the magnetic field at the sample position can be adjusted. Furthermore, the measurement setup is compared to the electromagnet offset module based on measured signals and temperature behavior.
We consider recent reports on small-world topologies of interaction networks derived from the dynamics of spatially extended systems that are investigated in diverse scientific fields such as neurosciences, geophysics, or meteorology. With numerical simulations that mimic typical experimental situations, we have identified an important constraint when characterizing such networks: indications of a small-world topology can be expected solely due to the spatial sampling of the system along with the commonly used time series analysis based approaches to network characterization.
Messenger apps like WhatsApp and Telegram are frequently used for everyday communication, but they can also be utilized as a platform for illegal activity. Telegram allows public groups with up to 200,000 participants. Criminals use these public groups for trading illegal commodities and services, which becomes a concern for law enforcement agencies, who manually monitor suspicious activity in these chat rooms. This research demonstrates how natural language processing (NLP) can assist in analyzing these chat rooms, providing an explorative overview of the domain and facilitating purposeful analyses of user behavior. We provide a publicly available corpus of text messages annotated with entities and relations from four self-proclaimed black market chat rooms. Our pipeline approach aggregates the product attributes extracted from user messages into profiles and uses these, together with the sold products, as features for clustering. The extracted structured information is the foundation for further data exploration, such as identifying the top vendors or fine-grained price analyses. Our evaluation shows that pretrained word vectors perform better for unsupervised clustering than state-of-the-art transformer models, while the latter are still superior for sequence labeling.
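The clustering stage of such a pipeline can be sketched as follows. Everything here is a toy stand-in: the two-dimensional "word vectors", the vocabulary, and the minimal 2-means routine are illustrative only, whereas a real pipeline would use pretrained embeddings such as fastText and a library clusterer.

```python
# Toy "pretrained" word vectors (hypothetical 2-d stand-ins for real embeddings)
WORD_VECS = {
    "weed":   (0.9, 0.1), "hash":   (0.8, 0.2), "gram":     (0.7, 0.1),
    "iphone": (0.1, 0.9), "imei":   (0.2, 0.8), "unlocked": (0.1, 0.7),
}

def message_vector(tokens):
    """Mean-pool the word vectors of known tokens (OOV tokens are skipped)."""
    vecs = [WORD_VECS[t] for t in tokens if t in WORD_VECS]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def kmeans2(points, iters=10):
    """Minimal 2-means over 2-d points, deterministically initialised with the
    first point and the point farthest from it."""
    d2 = lambda p, q: (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    centers = [points[0], max(points, key=lambda p: d2(p, points[0]))]
    for _ in range(iters):
        clusters = ([], [])
        for p in points:
            clusters[0 if d2(p, centers[0]) <= d2(p, centers[1]) else 1].append(p)
        centers = [tuple(sum(q[i] for q in cl) / len(cl) for i in range(2))
                   for cl in clusters]
    return centers, [0 if d2(p, centers[0]) <= d2(p, centers[1]) else 1
                     for p in points]

msgs = [["weed", "gram"], ["hash", "weed"], ["iphone", "imei"], ["unlocked", "iphone"]]
points = [message_vector(m) for m in msgs]
centers, labels = kmeans2(points)   # drug-related vs. phone-related messages
```

With these toy vectors, the two drug-related messages end up in one cluster and the two phone-related messages in the other.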
We study the novel possibilities that computer-aided design and production open up for the design of building systems. Via individualized mass production, such systems today can consist of a larger number of more complex parts than previously and can therefore be assembled into more complex wholes. This opens up the possibility of designing specialized systems specifically for single buildings. The common order of starting with a building system and designing a building using this system can be reversed: a building is designed first, and a system is then developed specifically for that building. We present and discuss research that incorporates students' design projects into research work and fosters links between research and teaching.
The concept of an injective affine embedding of the quantum states into a set of classical states, i.e., into the set of the probability measures on some measurable space, as well as its relation to statistically complete observables is revisited, and its limitation in view of a classical reformulation of the statistical scheme of quantum mechanics is discussed. In particular, on the basis of a theorem concerning a non-denseness property of a set of coexistent effects, it is shown that an injective classical embedding of the quantum states cannot be supplemented by an at least approximate classical description of the quantum mechanical effects. As an alternative approach, the concept of quasi-probability representations of quantum mechanics is considered.
The network approach towards the analysis of the dynamics of complex systems has been successfully applied in a multitude of studies in the neurosciences and has yielded fascinating insights. With this approach, a complex system is considered to be composed of different constituents which interact with each other. Interaction structures can be compactly represented in interaction networks. In this contribution, we present a brief overview about how interaction networks are derived from multivariate time series, about basic network characteristics, and about challenges associated with this analysis approach.
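A common way of deriving an interaction network from multivariate time series can be sketched as follows: compute a pairwise similarity between channels and threshold it into an adjacency set. The synthetic signals, the use of plain Pearson correlation, and the threshold value are illustrative choices, not the specific estimators used in the field.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def interaction_network(series, threshold=0.7):
    """Nodes are channels; an edge (i, j) links channels whose absolute
    correlation exceeds the threshold."""
    n = len(series)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(pearson(series[i], series[j])) > threshold}

# Two nearly phase-locked channels and one unrelated rhythm
t = [k / 50 for k in range(200)]
s1 = [math.sin(2 * math.pi * x) for x in t]
s2 = [math.sin(2 * math.pi * x + 0.1) for x in t]
s3 = [math.cos(5 * 2 * math.pi * x) for x in t]
edges = interaction_network([s1, s2, s3])   # only (0, 1) survives the threshold
```

Network characteristics such as clustering coefficient or average shortest path length are then computed on the resulting edge set.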
Researching the field of business intelligence and analytics (BI & A) has a long tradition within information systems research. Thereby, in each decade the rapid development of technologies opened new room for investigation. Since the early 1950s, the collection and analysis of structured data were the focus of interest, followed by unstructured data since the early 1990s. The third wave of BI & A comprises unstructured and sensor data of mobile devices. The article at hand aims at drawing a comprehensive overview of the status quo in relevant BI & A research of the current decade, focusing on the third wave of BI & A. By this means, the paper’s contribution is fourfold. First, a systematically developed taxonomy for BI & A 3.0 research, containing seven dimensions and 40 characteristics, is presented. Second, the results of a structured literature review containing 75 full research papers are analyzed by applying the developed taxonomy. The analysis provides an overview on the status quo of BI & A 3.0. Third, the results foster discussions on the predicted and observed developments in BI & A research of the past decade. Fourth, research gaps of the third wave of BI & A research are disclosed and concluded in a research agenda.
Mechanical forces/tensile stresses are critical determinants of cellular growth, differentiation and migration patterns in health and disease. The innovative "CellDrum technology" was designed for routinely measuring the mechanical tensile stress of cultured cell monolayers/thin tissue constructs. These are cultivated on very thin silicone membranes in the so-called CellDrum. The cell layers adhere firmly to the membrane and thus transmit the cell forces generated. A CellDrum consists of a cylinder which is sealed from below with a 4 μm thick, biocompatible, functionalized silicone membrane. The weight of the cell culture medium bulges the membrane downwards. This membrane indentation is measured. When cells contract due to drug action, membrane, cells and medium are lifted upwards. The induced indentation changes allow for quantification of the lateral drug-induced mechanical tension of the micro-tissues. With hiPS-derived (human) cardiomyocytes (CM), the CellDrum opens new perspectives for individualized cardiac drug testing. Here, monolayers of self-beating hiPS-CMs were grown in CellDrums. Rhythmic contractions of the hiPS cells induce up-and-down deflections of the membrane. The recorded cycles allow for analysis of single-beat amplitude, single-beat duration, integration of the single-beat amplitude over the beat time, and frequency. Dose effects of agonists and antagonists acting on Ca2+ channels were observed sensitively and highly reproducibly. Data were consistent with published reference data as far as these were available. The combination of the CellDrum technology with hiPS cardiomyocytes offers a fast, facile and precise system for pharmacological and toxicological studies. It enables new preclinical basic as well as applied research in pharmacology and toxicology.
The treatment of septic wounds with curative dressings based on biocomposites containing sage and marigold phytoextracts was effective in in vitro and in vivo experiments. These dressings caused the purification of the wound surface from purulent-necrotic masses three days earlier than in the other experimental groups. An increasing incidence of severe wound courses, together with the observed tendency towards a growing number of adverse effects, leads to the development of long-term recurrent wound processes. To treat purulent wounds, the following tactics were used: the purulent wounds of the animals were covered with the examined wound dressing, and samples were taken the next day; the procedure was performed once every 2 days. To obtain active nanostructured sorbents, carriers such as carbonized rice husks are functionalized with biologically active components possessing antimicrobial, anti-inflammatory, antitoxic, immunomodulating, antiallergic and other properties.
Biotechnological downstream processing is usually an elaborate procedure, requiring a multitude of unit operations to isolate the target component. Besides the disadvantageous space-time yield, the risks of cross-contaminations and product loss grow fast with the complexity of the isolation procedure. A significant reduction of unit operations can be achieved by application of magnetic particles, especially if these are functionalized with affinity ligands. As magnetic susceptible materials are highly uncommon in biotechnological processes, target binding and selective separation of such particles from fermentation or reactions broths can be done in a single step. Since the magnetizable particles can be produced from iron salts and low priced polymers, a single-use implementation of these systems is highly conceivable. In this article, the principles of magnetizable particles, their synthesis and functionalization are explained. Furthermore, applications in the area of reaction engineering, microfluidics and downstream processing are discussed focusing on established single-use technologies and development potential.
Fundamentals and ignition of a microplasma at 2.45 GHz / Holtrup, Stephan ; Heuermann, Holger
(2009)
Different analytical approaches exist to describe the structural substance or wear reserve of sewer systems. The aim is to convert engineering assessments of often complex defect patterns into computational algorithms and determine a substance class for a sewer section or manhole. This analytically determined information is essential for strategic rehabilitation planning processes up to network level, as it corresponds to the most appropriate rehabilitation type and can thus provide decision-making support. Current calculation methods differ clearly from each other in parts, so that substance classes determined by the different approaches are only partially comparable with each other. The objective of the German R&D cooperation project ‘SubKanS’ is to develop a methodology for classifying the specific defect patterns resulting from the interaction of all the individual defects, and their severities and locations. The methodology takes into account the structural substance of sewer sections and manholes, based on real data and theoretical considerations analogous to the condition classification of individual defects. The result is a catalogue of defect patterns and characteristics, as well as associated structural substance classifications of sewer systems (substance classes). The methodology for sewer system substance classification is developed so that the classification of individual defects can be transferred into a substance class of the sewer section or manhole, eventually taking into account further information (e.g. pipe material, nominal diameter, etc.). The result is a validated methodology for automated sewer system substance classification.
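The transfer of individual defect classifications into a single substance class can be sketched with a purely illustrative aggregation rule. This is a hypothetical stand-in, not the methodology developed in the 'SubKanS' project, which additionally accounts for defect interactions, severities, locations, and pipe metadata.

```python
def substance_class(defect_classes):
    """Illustrative rule only: condition classes run from 1 (worst) to 5 (best),
    the worst individual defect dominates the substance class, and a repeated
    worst defect downgrades the section by one further class (floor at 1)."""
    worst = min(defect_classes)
    repeated_worst = defect_classes.count(worst) > 1
    return max(1, worst - (1 if repeated_worst else 0))

# One severe defect dominates several minor ones
print(substance_class([5, 4, 2]))   # -> 2
# The same severe defect occurring twice downgrades the section further
print(substance_class([5, 2, 2]))   # -> 1
```

The point of the sketch is merely that an automated substance classification maps a list of per-defect classes onto one section-level class via explicit rules.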
Future evolution of risk management for structures : Advancement for the future IEC 62305-2 Ed3
(2011)
A Gamified Information System (GIS) implements game concepts and elements, such as affordances and game design principles, to motivate people. Based on the idea of developing a GIS to increase the motivation of software developers to perform software quality tasks, the research work at hand aims at investigating relevant requirements from that target group. Therefore, 14 interviews with software development experts were conducted and analyzed. According to the results, software developers prefer the affordances points and narrative storytelling in a multiplayer, round-based setting. Furthermore, six design principles for the development of a GIS are derived.
Virtual Reality (VR) offers novel possibilities for remote training regardless of the availability of the actual equipment, the presence of specialists, and the training locations. Research shows that training environments that adapt to users' preferences and performance can promote more effective learning. However, the observed results can hardly be traced back to specific adaptive measures rather than to the whole new training approach. This study analyzes the effects of a combined point and leveling VR-based gamification system on assembly training, targeting specific training outcomes and users' motivations. The Gamified-VR-Group with 26 subjects received the gamified training, and the Non-Gamified-VR-Group with 27 subjects received the alternative without gamified elements. Both groups conducted their VR training at least three times before assembling the actual structure. The study found that a level system that gradually increases the difficulty and error probability in VR can significantly lower real-world error rates, self-corrections, and support usages. According to our study, a high error occurrence at the highest training level reduced the Gamified-VR-Group's feeling of competence compared to the Non-Gamified-VR-Group, but at the same time also led to lower error probabilities in real life. It is concluded that a level system with variable task difficulty should be combined with carefully balanced positive and negative feedback messages. This way, better learning results and an improved self-evaluation can be achieved without significantly impacting the participants' feeling of competence.
Gas sensor investigation based on a catalytically activated thin-film thermopile for H2O2 detection
(2010)
Rehabilitative body weight supported gait training aims at restoring walking function as a key element in activities of daily living. Studies demonstrated reductions in muscle and joint forces, while kinematic gait patterns appear to be preserved with up to 30% weight support. However, the influence of body weight support on muscle architecture, with respect to fascicle and series elastic element behavior is unknown, despite this having potential clinical implications for gait retraining. Eight males (31.9 ± 4.7 years) walked at 75% of the speed at which they typically transition to running, with 0% and 30% body weight support on a lower-body positive pressure treadmill. Gastrocnemius medialis fascicle lengths and pennation angles were measured via ultrasonography. Additionally, joint kinematics were analyzed to determine gastrocnemius medialis muscle–tendon unit lengths, consisting of the muscle's contractile and series elastic elements. Series elastic element length was assessed using a muscle–tendon unit model. Depending on whether data were normally distributed, a paired t-test or Wilcoxon signed rank test was performed to determine if body weight supported walking had any effects on joint kinematics and fascicle–series elastic element behavior. Walking with 30% body weight support had no statistically significant effect on joint kinematics and peak series elastic element length. Furthermore, at the time when peak series elastic element length was achieved, and on average across the entire stance phase, muscle–tendon unit length, fascicle length, pennation angle, and fascicle velocity were unchanged with respect to body weight support. In accordance with unchanged gait kinematics, preservation of fascicle–series elastic element behavior was observed during walking with 30% body weight support, which suggests transferability of gait patterns to subsequent unsupported walking.
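The statistical comparison described above, a paired t-test (or a Wilcoxon signed rank test when normality fails), can be sketched for the t-test branch. The fascicle-length values below are hypothetical, and the normality check is assumed to have passed; the two-sided 5% critical value for df = 7 is approximately 2.365.

```python
import math
import statistics

def paired_t_statistic(x, y):
    """t = mean(d) / (sd(d) / sqrt(n)) for paired differences d = x - y."""
    d = [a - b for a, b in zip(x, y)]
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(len(d)))

# Hypothetical paired fascicle lengths (mm) at 0% and 30% body weight support
l_0  = [52.1, 49.8, 55.3, 50.6, 53.2, 48.9, 51.7, 54.0]
l_30 = [52.4, 49.5, 55.1, 50.9, 53.0, 49.2, 51.5, 54.3]

t_stat = paired_t_statistic(l_0, l_30)
# Two-sided 5% critical value for n = 8 pairs (df = 7) is about 2.365
significant = abs(t_stat) > 2.365
```

With such small, unsystematic differences the statistic stays well below the critical value, mirroring the "no statistically significant effect" finding reported above.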
With a steady increase of regulatory requirements for business processes, automation support of compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we reflect and discuss compliance-checking approaches in order to provide an insight into their generalizability and evaluation. The results imply that current approaches mainly focus on special modeling techniques and/or a restricted set of types of compliance rules. Most approaches abstain from real-world evaluation which raises the question of their practical applicability. Referring to the search results, we propose a roadmap for further research in model-based business process compliance checking.
Breast cancer resistance protein (BCRP) is expressed in various tissues, such as the gut, liver, kidney and blood brain barrier (BBB), where it mediates the unidirectional transport of substrates to the apical/luminal side of polarized cells. Thereby BCRP acts as an efflux pump, mediating the elimination or restricting the entry of endogenous compounds or xenobiotics into tissues and it plays important roles in drug disposition, efficacy and safety. Bcrp knockout mice (Bcrp−/−) have been used widely to study the role of this transporter in limiting intestinal absorption and brain penetration of substrate compounds. Here we describe the first generation and characterization of a mouse line humanized for BCRP (hBCRP), in which the mouse coding sequence from the start to stop codon was replaced with the corresponding human genomic region, such that the human transporter is expressed under control of the murine Bcrp promoter. We demonstrate robust human and loss of mouse BCRP/Bcrp mRNA and protein expression in the hBCRP mice and the absence of major compensatory changes in the expression of other genes involved in drug metabolism and disposition. Pharmacokinetic and brain distribution studies with several BCRP probe substrates confirmed the functional activity of the human transporter in these mice. Furthermore, we provide practical examples for the use of hBCRP mice to study drug-drug interactions (DDIs). The hBCRP mouse is a promising model to study the in vivo role of human BCRP in limiting absorption and BBB penetration of substrate compounds and to investigate clinically relevant DDIs involving BCRP.
Generation and Characterization of a Novel Multidrug Resistance Protein 2 Humanized Mouse Line
(2012)
The multidrug resistance protein (MRP) 2 is predominantly expressed in liver, intestine, and kidney, where it plays an important role in the excretion of a range of drugs and their metabolites or endogenous compounds into bile, feces, and urine. Mrp knockout [Mrp2(−/−)] mice have been used recently to study the role of MRP2 in drug disposition. Here, we describe the first generation and initial characterization of a mouse line humanized for MRP2 (huMRP2), which is nulled for the mouse Mrp2 gene and expresses the human transporter in the organs and cell types where MRP2 is normally expressed. Analysis of the mRNA expression for selected cytochrome P450 and transporter genes revealed no major changes in huMRP2 mice compared with wild-type controls. We show that human MRP2 is able to compensate functionally for the loss of the mouse transporter as demonstrated by comparable bilirubin levels in the humanized mice and wild-type controls, in contrast to the hyperbilirubinemia phenotype that is observed in MRP2(−/−) mice. The huMRP2 mouse provides a model to study the role of the human transporter in drug disposition and in assessing the in vivo consequences of inhibiting this transporter by compounds interacting with human MRP2.
Compared with rodents and many other animal species, the human cytochrome P450 (P450) Cyp2c gene cluster varies significantly in the multiplicity of functional genes and in the substrate specificity of its enzymes. As a consequence, the use of wild-type animal models to predict the role of human CYP2C enzymes in drug metabolism and drug-drug interactions is limited. Within the human CYP2C cluster CYP2C9 is of particular importance, because it is one of the most abundant P450 enzymes in human liver, and it is involved in the metabolism of a wide variety of important drugs and environmental chemicals. To investigate the in vivo functions of cytochrome P450 Cyp2c genes and to establish a model for studying the functions of CYP2C9 in vivo, we have generated a mouse model with a deletion of the murine Cyp2c gene cluster and a corresponding humanized model expressing CYP2C9 specifically in the liver. Despite the high number of functional genes in the mouse Cyp2c cluster and the reported roles of some of these proteins in different biological processes, mice deleted for Cyp2c genes were viable and fertile but showed certain phenotypic alterations in the liver. The expression of CYP2C9 in the liver also resulted in viable animals active in the metabolism and disposition of a number of CYP2C9 substrates. These mouse lines provide a powerful tool for studying the role of Cyp2c genes and of CYP2C9 in particular in drug disposition and as a factor in drug-drug interaction.
Disruption experiments targeted at the Bacillus licheniformis degSU operon and GFP reporter analysis provided evidence for promoter activity immediately upstream of degU. pMutin-mediated concomitant introduction of the degU32 allele, known to cause hypersecretion in Bacillus subtilis, resulted in a marked increase in protease activity. Application of 5-fluorouracil-based counterselection through establishment of a phosphoribosyltransferase-deficient Δupp strain eventually facilitated the marker-free introduction of degU32, leading to further protease enhancement reaching levels as in hypersecreting wild strains in which degU was overexpressed. Surprisingly, deletion of rapG, known to interfere with DegU DNA binding in B. subtilis, enhanced protease production neither in the wild type nor in the degU32 strain. The combination of degU32 and Δupp counterselection in the type strain is not only as effective with respect to protease production as in hypersecreting wild strains, but furthermore facilitates genetic strain improvement aiming at biological containment and effectiveness of biotechnological processes.
Genetically humanized mice for proteins involved in drug metabolism and toxicity and mice engrafted with human hepatocytes are emerging as promising in vivo models for improved prediction of the pharmacokinetic, drug–drug interaction, and safety characteristics of compounds in humans. This is an overview on the genetically humanized and chimeric liver-humanized mouse models, which are illustrated with examples of their utility in drug metabolism and toxicity studies. The models are compared to give guidance for selection of the most appropriate model by highlighting advantages and disadvantages to be carefully considered when used for studies in drug discovery and development.
1. Drug metabolizing enzymes and transporters play important roles in the absorption, metabolism, tissue distribution and excretion of various compounds and their metabolites and thus can significantly affect their efficacy and safety. Furthermore, they can be involved in drug–drug interactions which can result in adverse responses, life-threatening toxicity or impaired efficacy. Significant species differences in the interaction of compounds with drug metabolizing enzymes and transporters have been described.
2. In order to overcome the limitation of animal models in accurately predicting human responses, a large variety of mouse models humanized for drug metabolizing enzymes and to a lesser extent drug transporters have been created.
3. This review summarizes the literature describing these mouse models and their key applications in studying the role of drug metabolizing enzymes and transporters in drug bioavailability, tissue distribution, clearance and drug–drug interactions as well as in human metabolite testing and risk assessment.
4. Though such humanized mouse models have certain limitations, there is great potential for their use in basic research and for testing and development of new medicines. These limitations and future potentials will be discussed.
GHEtool is a Python package that contains all the functionalities needed for borefield design. It is developed for both researchers and practitioners. The core of this package is the automated sizing of borefields under different conditions, which is typically slow due to the high complexity of the mathematical background. Because the tool relies on a large amount of precalculated data, GHEtool can size a borefield in the order of tenths of milliseconds, whereas conventional sizing typically takes on the order of minutes. The tool is therefore well suited for implementation in typical workflows where iterations are required.
GHEtool also comes with a graphical user interface (GUI). The GUI is prebuilt as an exe-file, which provides access to all the functionalities without coding. A setup to install the GUI at a user-defined location is also implemented and available at: https://www.mech.kuleuven.be/en/tme/research/thermal_systems/tools/ghetool.
Training end users how to interact with digital systems is indispensable to ensure strong computer security. 'Competence Developing Game'-based approaches are particularly suitable for this purpose because of their motivation and simulation aspects. In this paper, the Competence Developing Game 'GHOST' for cybersecurity awareness trainings and its underlying patterns are described. Accordingly, requirements for a 'Competence Developing Game'-based training are discussed. Based on these requirements, it is shown how a game can fulfill them. A supplementary game interaction design and a corresponding evaluation study are presented. The combination of training requirements and interaction design is used to create a 'Competence Developing Game'-based training concept. A part of this concept is implemented into a playable prototype that provides around one hour of play and training time. This prototype is used to evaluate the game and training aspects of the awareness training. Thereby, the quality of the game aspect and the effectiveness of the training aspect are shown.
Searching optimal continuous-thrust trajectories is usually a difficult and time-consuming task. The solution quality of traditional optimal-control methods depends strongly on an adequate initial guess because the solution is typically close to the initial guess, which may be far from the (unknown) global optimum. Evolutionary neurocontrol attacks continuous-thrust optimization problems from the perspective of artificial intelligence and machine learning, combining artificial neural networks and evolutionary algorithms. This chapter describes the method and shows some example results for single- and multi-phase continuous-thrust trajectory optimization problems to assess its performance. Evolutionary neurocontrol can explore the trajectory search space more exhaustively than a human expert can do with traditional optimal-control methods. Especially for difficult problems, it usually finds solutions that are closer to the global optimum. Another fundamental advantage is that continuous-thrust trajectories can be optimized without an initial guess and without expert supervision.
Low-thrust space propulsion systems enable flexible high-energy deep space missions, but the design and optimization of the interplanetary transfer trajectory is usually difficult. It involves much experience and expert knowledge because the convergence behavior of traditional local trajectory optimization methods depends strongly on an adequate initial guess. Within this extended abstract, evolutionary neurocontrol, a method that fuses artificial neural networks and evolutionary algorithms, is proposed as a smart global method for low-thrust trajectory optimization. It does not require an initial guess. The implementation of evolutionary neurocontrol is detailed and its performance is shown for an exemplary mission.
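The core idea of evolutionary neurocontrol, evolving the weights of a neural controller instead of requiring an initial guess, can be sketched with a toy (1+1) evolution strategy. Everything here is illustrative: the 1-3-1 network, the mutation parameters, and the stand-in target function replace the actual trajectory-cost evaluation of the real method.

```python
import math
import random

def net(weights, x):
    """Tiny 1-3-1 feedforward network with tanh hidden units (10 weights)."""
    w1, b1, w2, b2 = weights[0:3], weights[3:6], weights[6:9], weights[9]
    h = [math.tanh(w1[i] * x + b1[i]) for i in range(3)]
    return sum(w2[i] * h[i] for i in range(3)) + b2

def target(x):
    """Stand-in objective; the real method scores complete trajectories instead."""
    return math.sin(x)

def fitness(weights, samples):
    """Negative mean squared error against the target (higher is better)."""
    return -sum((net(weights, x) - target(x)) ** 2 for x in samples) / len(samples)

def evolve(generations=500, sigma=0.3, seed=1):
    """(1+1) evolution strategy: keep a mutated weight vector only if it improves.
    No initial guess is needed; the start point is random."""
    rng = random.Random(seed)
    samples = [i / 10 for i in range(-10, 11)]
    best = [rng.uniform(-1, 1) for _ in range(10)]
    best_fit = fitness(best, samples)
    for _ in range(generations):
        child = [w + rng.gauss(0, sigma) for w in best]   # mutate every weight
        f = fitness(child, samples)
        if f > best_fit:                                  # plus-selection
            best, best_fit = child, f
    return best, best_fit
```

The full method replaces the scalar fitness with an integrated trajectory evaluation and the (1+1) loop with a population-based evolutionary algorithm, but the search principle is the same.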
Goal Driven Business Modelling - Supporting Decision Making within Information System Development
(1995)
The coupling of ligand-stabilized gold nanoparticles with field-effect devices offers new possibilities for label-free biosensing. In this work, we study the immobilization of aminooctanethiol-stabilized gold nanoparticles (AuAOTs) on the silicon dioxide surface of a capacitive field-effect sensor. The terminal amino group of the AuAOT is well suited for the functionalization with biomolecules. The attachment of the positively-charged AuAOTs on a capacitive field-effect sensor was detected by direct electrical readout using capacitance-voltage and constant capacitance measurements. With a higher particle density on the sensor surface, the measured signal change was correspondingly more pronounced. The results demonstrate the ability of capacitive field-effect sensors for the non-destructive quantitative validation of nanoparticle immobilization. In addition, the electrostatic binding of the polyanion polystyrene sulfonate to the AuAOT-modified sensor surface was studied as a model system for the label-free detection of charged macromolecules. Most likely, this approach can be transferred to the label-free detection of other charged molecules such as enzymes or antibodies.
A technology reference study for a multiple near-Earth object (NEO) rendezvous mission with solar sailcraft is currently being carried out by the authors of this paper. The investigated mission builds on previous concepts, but adopts a strong micro-spacecraft philosophy based on the DLR/ESA Gossamer technology. The main scientific objective of the mission is to explore the diversity of NEOs. After direct interplanetary insertion, the solar sailcraft should, within less than 10 years, rendezvous three NEOs that are not only scientifically interesting but also relevant from the point of view of human spaceflight and planetary defense. In this paper, the objectives of the study are outlined and a preliminary potential mission profile is presented.
A technology reference study for a solar polar mission is presented. The study uses novel analytical methods to quantify the mission design space, including the sail performance required to achieve a given solar polar observation angle within a given timeframe, and thus to derive mass allocations for the remaining spacecraft sub-systems, i.e. excluding the solar sail sub-system. A parametric, bottom-up system mass budget analysis is then used to establish the sail technology required to deliver a range of science payloads, and to establish where such payloads can be delivered within a given timeframe. It is found that a solar polar mission requires a solar sail of side length 100–125 m to deliver a 'sufficient value' minimum science payload, and that a 2.5 μm sail film substrate is typically required; however, the design is much less sensitive to the boom specific mass.
A technology reference study for a displaced Lagrange point space weather mission is presented. The mission builds on previous concepts, but adopts a strong micro-spacecraft philosophy to deliver a low mass platform and payload which can be accommodated on the DLR/ESA Gossamer-3 technology demonstration mission. A direct escape from Geostationary Transfer Orbit is assumed with the sail deployed after the escape burn. The use of a miniaturized, low mass platform and payload then allows the Gossamer-3 solar sail to potentially double the warning time of space weather events. The mission profile and mass budgets will be presented to achieve these ambitious goals.
One central challenge for self-driving cars is proper path planning. Once a trajectory has been found, the next challenge is to follow the precalculated path accurately and safely. The model-predictive controller (MPC) is a common approach for the lateral control of autonomous vehicles. The MPC uses a vehicle dynamics model to predict the future states of the vehicle over a given prediction horizon. However, to achieve real-time path control, the computational load is usually large, which forces short prediction horizons. To deal with the computational load, the control algorithm can be parallelized on the graphics processing unit (GPU). In contrast to the widely used stochastic methods, in this paper we propose a deterministic approach based on grid search. Our approach systematically explores the search space at different levels of granularity. To achieve this, we split the optimization algorithm into multiple iterations; the best sequence of each iteration is used as the initial solution for the next iteration. The granularity increases with each iteration, resulting in smooth and predictable steering-angle sequences. We present a novel GPU-based algorithm and demonstrate its accuracy and real-time capability in a number of real-world experiments.
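The iterative coarse-to-fine grid search described in this abstract can be illustrated with a minimal sketch. The example below is a hypothetical one-dimensional reduction (the actual method optimizes steering-angle sequences in parallel on the GPU); the function name, parameter choices, and toy cost function are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def coarse_to_fine_search(cost, lo, hi, points=9, iterations=4):
    """Deterministic grid search: each iteration evaluates a uniform grid,
    then zooms in around the best candidate with finer spacing."""
    best = None
    for _ in range(iterations):
        grid = np.linspace(lo, hi, points)
        costs = [cost(x) for x in grid]           # evaluable in parallel on a GPU
        best = float(grid[int(np.argmin(costs))])
        span = (hi - lo) / (points - 1)           # current grid spacing
        lo, hi = best - span, best + span         # refine around the best point
    return best

# Toy cost: quadratic with its minimum at 0.3; the search narrows in on it.
x = coarse_to_fine_search(lambda u: (u - 0.3) ** 2, -1.0, 1.0)
```

Because every iteration evaluates a fixed, predetermined grid, the runtime is deterministic, which is what makes the approach attractive for real-time control compared to stochastic sampling.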
The integration of high-temperature thermal energy storage into existing conventional power plants can help to reduce the CO2 emissions of those plants and, due to synergy effects, lead to lower capital expenditures for building energy storage systems [1]. One possibility to implement this is a molten salt storage system with a powerful power-to-heat unit. This paper presents two possible control concepts for the startup of the charging system of such a facility. The procedures are implemented in a detailed dynamic process model. Their performance and their safety with regard to the film temperatures at heat-transmitting surfaces are investigated in process simulations. To improve the accuracy of the predicted film temperatures, CFD simulations of the electrical heater are carried out and the results are merged with the dynamic model. The results show that both investigated control concepts are safe with regard to the temperature limits. The gradient-controlled startup performed better than the temperature-controlled startup. Nevertheless, there are several uncertainties that need to be investigated further.
Grain boundary and surface segregation of Ba-Ti-O phases in rutile. O'Bryan, H. M.; Hagemann, H. J.
(1987)
Water suppliers are faced with the great challenge of achieving a high-quality and, at the same time, low-cost water supply. In practice, the focus is set on the most beneficial maintenance measures and/or capacity adaptations of existing water distribution systems (WDS). Since climatic and demographic influences will pose further challenges in the future, the resilience enhancement of WDS, i.e. the enhancement of their capability to withstand and recover from disturbances, has been in particular focus recently. To assess the resilience of WDS, metrics based on graph theory have been proposed. In this study, a promising approach is applied to assess the resilience of the WDS for a district in a major German city. The conducted analysis provides insight into the process of actively influencing the resilience of WDS.
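As an illustration of the kind of graph-theoretic metric used in such resilience assessments, the sketch below computes the meshedness coefficient, a commonly cited redundancy surrogate for planar pipe networks. The abstract does not name the specific metric applied in the study, so this choice, and the toy network sizes, are assumptions for illustration only.

```python
def meshedness(num_nodes: int, num_edges: int) -> float:
    """Meshedness coefficient of a planar network: the ratio of
    independent loops present (m - n + 1) to the maximum number
    possible in a planar graph (2n - 5). Higher values indicate
    more redundant paths, a proxy for resilience."""
    return (num_edges - num_nodes + 1) / (2 * num_nodes - 5)

# Toy WDS with 6 junctions: a branched (tree-like) layout has no loops,
# while adding two extra pipes creates two independent loops.
branched = meshedness(6, 5)   # tree: m - n + 1 = 0
looped = meshedness(6, 7)     # two loops out of seven possible
```

A tree-shaped network scores zero: a single pipe break disconnects customers, whereas a looped layout retains alternative supply paths.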