Interplanetary trajectories for low-thrust spacecraft are often characterized by multiple revolutions around the Sun. Unfortunately, the convergence of traditional trajectory optimizers that are based on numerical optimal control methods depends strongly on an adequate initial guess for the control function (if a direct method is used) or for the starting values of the adjoint vector (if an indirect method is used). Especially when many revolutions around the Sun are required, trajectory optimization becomes a very difficult and time-consuming task that demands considerable experience and expert knowledge in astrodynamics and optimal control theory, because an adequate initial guess is extremely hard to find. Evolutionary neurocontrol (ENC) was proposed as a smart method for low-thrust trajectory optimization that fuses artificial neural networks and evolutionary algorithms into so-called evolutionary neurocontrollers (ENCs) [1]. Inspired by natural archetypes, ENC attacks the trajectory optimization problem from the perspective of artificial intelligence and machine learning, a perspective that is quite different from that of optimal control theory. Within the context of ENC, a trajectory is regarded as the result of a spacecraft steering strategy that continuously maps the current spacecraft state and the current target state onto the current spacecraft control vector. This way, the problem of searching for the optimal spacecraft trajectory is equivalent to the problem of searching for (or "learning") the optimal spacecraft steering strategy. An artificial neural network is used to implement such a spacecraft steering strategy. It can be regarded as a parameterized function (the network function) that is defined by the internal network parameters. Therefore, each distinct set of network parameters defines a different network function and thus a different steering strategy. The problem of searching for the optimal steering strategy is then equivalent to the problem of searching for the optimal set of network parameters. Evolutionary algorithms that work on a population of (artificial) chromosomes are used to find the optimal network parameters, because the parameters can easily be mapped onto a chromosome. The trajectory optimization problem is solved when the optimal chromosome is found. A comparison of solar sail trajectories that have been published by others [2, 3, 4, 5] with ENC trajectories has shown that ENCs can be successfully applied for near-globally optimal spacecraft control [1, 6] and that they are able to find trajectories that are closer to the (unknown) global optimum, because they explore the trajectory search space more exhaustively than a human expert can. The obtained trajectories are fairly accurate with respect to the terminal constraint. If a more accurate trajectory is required, the ENC solution can be used as an initial guess for a local trajectory optimization method. Using ENC, low-thrust trajectories can be optimized without an initial guess and without expert attendance.
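The core loop of this approach can be sketched compactly. Below is a minimal, self-contained Python illustration of evolutionary neurocontrol on a toy problem: a small network steers a 2-D point mass toward a fixed target, and a simple (mu + lambda) evolution strategy searches the network weights. The network size, the toy dynamics, and the evolution-strategy settings are illustrative assumptions, not the setup used in [1].

```python
# Minimal sketch of evolutionary neurocontrol (ENC): a small neural
# network maps the current state onto a control vector, and an
# evolution strategy searches the network weights. The 2-D point-mass
# dynamics and all sizes below are illustrative assumptions.
import numpy as np

N_IN, N_HID, N_OUT = 4, 8, 2            # state -> hidden -> control direction
N_W = N_IN * N_HID + N_HID * N_OUT      # number of parameters ("chromosome" length)

def steer(w, state):
    """Network function: map the state to a bounded control vector."""
    w1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    w2 = w[N_IN * N_HID:].reshape(N_HID, N_OUT)
    return np.tanh(np.tanh(state @ w1) @ w2)      # components in [-1, 1]

def fitness(w, dt=0.1, steps=200, a_max=0.05):
    """Simulate the steering strategy; reward ending close to the target."""
    pos, vel, target = np.zeros(2), np.zeros(2), np.array([1.0, 0.5])
    for _ in range(steps):
        state = np.concatenate([target - pos, -vel])   # relative state
        vel += a_max * steer(w, state) * dt
        pos += vel * dt
    return -np.linalg.norm(target - pos)               # higher is better

rng = np.random.default_rng(0)
pop = rng.normal(size=(30, N_W))                       # population of chromosomes
for gen in range(100):                                 # (mu + lambda) evolution strategy
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]            # keep the 10 best
    children = parents[rng.integers(10, size=20)] + 0.1 * rng.normal(size=(20, N_W))
    pop = np.vstack([parents, children])
print("best miss distance:", -max(fitness(w) for w in pop))
```

In the actual method, the fitness would be the mission objective (e.g. transfer time or propellant mass) evaluated by numerically integrating the full orbital dynamics with the network in the loop.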
Here, new results for nuclear electric spacecraft and for solar sail spacecraft are presented, and it will be shown that ENCs find very good trajectories even for very difficult problems. Trajectory optimization results are presented for
1. NASA's Solar Polar Imager mission, a mission to attain a highly inclined close solar orbit with a solar sail [7],
2. a mission to deflect the asteroid Apophis with a solar sail from a retrograde orbit with a very high-velocity impact [8, 9], and
3. JPL's 2nd Global Trajectory Optimization Competition, a grand tour to visit four asteroids from different classes with a nuclear electric propulsion (NEP) spacecraft.
Flow separation is a phenomenon that can occur in all kinds of supersonic nozzles during run-up and shut-down operations. Especially in expansion nozzles of rocket engines with a large area ratio, flow separation can trigger strong side loads that can damage the structure of the nozzle. The investigation presented in this paper seeks to establish measures that may be applied to alter the point of flow separation. To achieve this, a second supersonic nozzle was placed at the exit plane of the conical nozzle, generating a cross flow surrounding the core jet from the conical nozzle. Due to the entrainment of the gas stream from the conical nozzle, the pressure in its exit plane was found to be lower than the ambient pressure. Cold gas instead of hot combustion gases was used as the working fluid. A mathematical simulation of the concept was validated by experiment. Measurements confirmed the simulation results: due to the introduction of the second nozzle, the pressure in the separated region of the conical nozzle was significantly reduced. It was also established that boundary layer separation inside the conical nozzle was delayed, thus allowing an increased degree of overexpansion. The condition established by the pressure measurements was also demonstrated qualitatively using transparent nozzle configurations.
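To see why a lower effective back pressure delays separation, it helps to look at the inviscid wall pressure along the nozzle. The sketch below is a rough illustration under assumed values (geometry, gamma = 1.4, a chamber-to-ambient pressure ratio of 30, and a Summerfield-type criterion that separation occurs where the wall pressure drops below roughly 0.4 of the back pressure): it computes the Mach number from the isentropic area-Mach relation and flags the predicted separation station.

```python
# Hedged sketch: isentropic flow through a conical nozzle plus a simple
# Summerfield-type separation criterion (p_wall / p_ambient < ~0.4).
# Geometry, gamma, and the criterion constant are illustrative assumptions.
import math

GAMMA = 1.4

def mach_from_area_ratio(eps, lo=1.0, hi=50.0):
    """Supersonic Mach number for a given area ratio A/A* (bisection)."""
    def f(m):
        t = (2.0 / (GAMMA + 1.0)) * (1.0 + 0.5 * (GAMMA - 1.0) * m * m)
        return (1.0 / m) * t ** ((GAMMA + 1.0) / (2.0 * (GAMMA - 1.0))) - eps
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
    return 0.5 * (lo + hi)

def pressure_ratio(m):
    """Static-to-total pressure ratio p/p0 at Mach m (isentropic)."""
    return (1.0 + 0.5 * (GAMMA - 1.0) * m * m) ** (-GAMMA / (GAMMA - 1.0))

p0_over_pa = 30.0                             # chamber-to-ambient ratio (assumed)
for eps in [2, 4, 6, 8, 10, 15, 20]:          # area ratios along the nozzle
    m = mach_from_area_ratio(eps)
    p_wall_over_pa = pressure_ratio(m) * p0_over_pa
    sep = "separated" if p_wall_over_pa < 0.4 else "attached"
    print(f"A/A* = {eps:5.1f}  M = {m:4.2f}  p_w/p_a = {p_wall_over_pa:5.2f}  {sep}")
```

Lowering the effective back pressure seen by the conical nozzle, as the second nozzle does in the paper's configuration, shifts the flagged separation station downstream.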
On 1st January 1998, the German telecom market was fully liberalised. Since then, genuine competition between market participants has developed, based on a comprehensive legal and regulatory framework that provides safeguards against unfair competition and against market power held by Deutsche Telekom. Today, about 10 years after the liberalisation of the telecommunications sector, a revision of this regulatory approach has become necessary, because the situation differs from that of 10 years ago in at least three dimensions: First, with numerous alternative operators now established in the market, monopolies have been successfully challenged and competition has been introduced. Second, not only is cable TV becoming a viable alternative for the provision of broadband services in large parts of Germany, but mobile services are also increasingly becoming a substitute for fixed services. Last but not least, important technological changes are under way, requiring huge investments in infrastructure upgrades for next-generation networks. In the light of these new developments, the question is to what extent the current approach of severe ex-ante regulatory intervention is still appropriate. Is any part of the former incumbent's network still a bottleneck? A more light-handed regulatory approach might be the right response to this new situation. The paper is organised as follows: The first section briefly examines the economic rationale for regulating network access. Based on the assumption that regulation is necessary wherever bottlenecks exist, regulatory principles for an efficient network access regime are derived. The second section compares the German market of early 1998 with that of today along three dimensions: the degree of competition, the potential for substitution, and technological developments. The third section defines requirements for the future regulation of telecom markets and elaborates proposals for how to ensure competitive telecom markets in the light of new economic and technological challenges.
Numerical models have become an essential part of snow avalanche engineering. Recent advances in understanding the rheology of flowing snow and the mechanics of entrainment and deposition have made numerical models more reliable. Coupled with field observations and historical records, they are especially helpful in understanding avalanche flow in complex terrain. However, the application of numerical models poses several new challenges to avalanche engineers. A detailed understanding of the avalanche phenomenon is required to specify the initial conditions (release zone dimensions and snowcover entrainment rates) as well as the friction parameters, which are no longer based on empirical back-calculations alone but on terrain roughness, vegetation and snow properties. In this paper we discuss these problems by presenting the computer model RAMMS, which was specially designed by the SLF as a practical tool for avalanche engineers. RAMMS solves the depth-averaged equations governing avalanche flow with first- and second-order numerical solution schemes. A tremendous effort has been invested in the implementation of advanced input and output features; simulation results are therefore clearly and easily visualized, which simplifies their interpretation. More importantly, RAMMS has been applied to a series of well-documented avalanches to gauge model performance. In this paper we present the governing differential equations, highlight some of the input and output features of RAMMS, and then discuss the simulation of the Gatschiefer avalanche that occurred in April 2008 near Klosters/Monbiel, Switzerland.
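To make "depth-averaged" concrete, here is a deliberately minimal 1-D sketch of such a model: an explicit first-order update of flow depth and depth-averaged velocity on a uniform slope with a Voellmy-type friction law (dry-friction coefficient mu and turbulent coefficient xi). The grid, the parameter values, and the omission of the pressure-gradient and advection terms are simplifying assumptions for illustration; this is not the RAMMS scheme itself.

```python
# Minimal 1-D sketch of a depth-averaged avalanche update with a
# Voellmy-type friction law, first-order explicit scheme on a uniform
# slope. Parameters and discretisation are illustrative assumptions;
# pressure-gradient and momentum-advection terms are omitted for brevity.
import numpy as np

g, mu, xi = 9.81, 0.2, 1500.0      # gravity, Voellmy dry and turbulent friction
phi = np.radians(30.0)             # uniform slope angle (assumed)
dx, dt, nx = 5.0, 0.05, 200        # grid spacing [m], time step [s], cells

h = np.zeros(nx); h[20:40] = 2.0   # release zone: 100 m long, 2 m deep (assumed)
u = np.zeros(nx)                   # depth-averaged downslope velocity [m/s]

for _ in range(400):
    # mass balance: dh/dt + d(hu)/dx = 0, first-order upwind (u >= 0 here)
    flux = h * u
    h = np.maximum(h - dt / dx * (flux - np.roll(flux, 1)), 0.0)
    # momentum: gravity along the slope minus Voellmy friction,
    #   friction slope S = mu * g * cos(phi) + g * u^2 / (xi * h)
    wet = h > 1e-3
    fric = mu * g * np.cos(phi) + g * u**2 / (xi * np.where(wet, h, 1.0))
    u = np.where(wet, np.maximum(u + dt * (g * np.sin(phi) - fric), 0.0), 0.0)

print(f"front position: {np.nonzero(h > 1e-3)[0].max() * dx:.0f} m, "
      f"max velocity: {u.max():.1f} m/s")
```

The Voellmy law makes the flow approach a terminal velocity set by xi and the flow depth, which is why calibrating mu and xi against terrain and snow properties matters so much in practice.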
Solar tower power plants
(2008)
7th International Conference on Reliability of Materials and Structures (RELMAS 2008), June 17-20, 2008, Saint Petersburg, Russia, pp. 354-358. Reprint with corrections in red.
The analysis of advanced structures working under extremely heavy loading, such as nuclear power plants and piping systems, should take into account the randomness of the loading as well as of the geometrical and material parameters. Existing reliability methods are restricted mostly to the elastic working regime, e.g. to allowable local stresses. The development of limit and shakedown reliability-based analysis and design methods, which exploit the potential of the shakedown working regime, is therefore highly needed. In this paper the application of a new algorithm for the probabilistic limit and shakedown analysis of shell structures is presented, in which the loading, the strength of the material and the thickness of the shell are considered as random variables. The reliability analysis problems can be solved efficiently by a system that combines available FE codes, a deterministic limit and shakedown analysis, and the First- and Second-Order Reliability Methods (FORM/SORM). Non-linear sensitivities are obtained directly from the solution of the deterministic problem without extra computational cost.
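Since FORM is central to the approach, a minimal sketch of its standard workhorse, the Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration, may help. The limit-state function below (a simple resistance-minus-load margin in standard normal space) and all its coefficients are illustrative assumptions, not the shell shakedown problem of the paper.

```python
# Minimal FORM sketch: Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration.
# Finds the design point u* (most probable failure point) in standard
# normal space; beta = ||u*||, P_f ~ Phi(-beta). The limit-state
# function below is an illustrative assumption, not the paper's model.
import numpy as np
from scipy.stats import norm

def g(u):
    """Example limit state: resistance minus load; g <= 0 means failure."""
    r, s = 5.0 + 1.0 * u[0], 2.0 + 0.8 * u[1]   # assumed transformations
    return r - s

def grad_g(u, eps=1e-6):
    """Central finite-difference gradient of g."""
    return np.array([(g(u + eps * e) - g(u - eps * e)) / (2 * eps)
                     for e in np.eye(len(u))])

u = np.zeros(2)                                 # start at the origin
for _ in range(50):                             # HL-RF fixed-point iteration
    grad = grad_g(u)
    u_new = (grad @ u - g(u)) / (grad @ grad) * grad
    if np.linalg.norm(u_new - u) < 1e-8:
        break
    u = u_new

beta = np.linalg.norm(u)
print(f"reliability index beta = {beta:.3f}, P_f ~ {norm.cdf(-beta):.2e}")
```

For the linear limit state used here the iteration converges essentially in one step, with beta = 3/sqrt(1.64) ≈ 2.34; for non-linear limit states, such as those arising from shakedown analysis, the same loop iterates toward the most probable failure point.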
The sorption of toxic shock LPS by nanoparticles based on carbonized vegetable raw materials
(2008)
Immobilization of lactobacilli on high-temperature carbonized vegetable raw materials (rice husk, grape stones) increases their physiological activity and the quantity of antibacterial metabolites, which consequently leads to an increase in the antagonistic activity of the lactobacilli. This implies that the use of such nanosorbents for the attachment of probiotic microorganisms is highly promising for solving important problems such as delivering probiotic preparations to the right location and attaching them to the intestinal mucosa, with subsequent detoxication of the gastro-intestinal tract and normalization of its microecology. In addition, the carbonized nanoparticles obtained have a peculiar property: the ability to sorb toxic shock LPS and hence to detoxify LPS.
Optimization of the reaeration potential on embankment stepped spillways in skimming flow regime
(2008)
Tool-supported requirements analysis for the user-centered development of mobile enterprise software
(2008)
A user-centered development method has proved effective for the development of mobile enterprise software. To make use of this method, detailed information about the user and about the place where the user interacts with his mobile device is required. This article describes how both can be modeled by a UML extension based on stereotypes and conceptual extensions. Finally, a software tool is presented that supports the developed UML extension.
Time-of-flight (ToF) sensors have become an alternative to conventional distance sensing techniques like laser scanners or image-based stereo. ToF sensors provide full-range distance information at high frame rates and thus have a significant impact on current research in areas like online object recognition, collision prevention and scene reconstruction. However, ToF cameras like the photonic mixer device (PMD) still exhibit a number of challenges regarding static and dynamic effects, e.g. systematic distance errors and motion artefacts, respectively. Sensor calibration techniques reducing static system errors have been proposed and show promising results. However, current calibration techniques generally need a large set of reference data in order to determine the corresponding parameters of the calibration model. This paper introduces a new calibration approach which combines different demodulation techniques for the ToF camera's reference signal. Examples show that the resulting combined demodulation technique yields improved distance values based on only two required reference data sets.
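For context on the demodulation step: continuous-wave ToF cameras such as PMD devices commonly estimate the phase shift between the emitted and the received signal from four correlation samples taken 90 degrees apart. The sketch below shows one common form of this four-phase computation; the sample values and the modulation frequency are illustrative assumptions, and the paper's combined demodulation technique is not reproduced here.

```python
# Standard four-phase demodulation for a continuous-wave ToF pixel:
# four correlation samples A0..A3 taken at 0, 90, 180 and 270 degrees
# yield phase, amplitude and distance. Sample values and modulation
# frequency below are illustrative assumptions.
import math

C = 299_792_458.0          # speed of light [m/s]
F_MOD = 20e6               # modulation frequency [Hz] (assumed, PMD-typical)

def demodulate(a0, a1, a2, a3):
    """Return (distance [m], amplitude, offset) from four phase samples."""
    phase = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)
    amplitude = 0.5 * math.hypot(a3 - a1, a0 - a2)
    offset = 0.25 * (a0 + a1 + a2 + a3)           # background light level
    distance = C * phase / (4 * math.pi * F_MOD)  # unambiguous up to c/(2*F_MOD)
    return distance, amplitude, offset

# Example: a target that produces a 45-degree phase shift
d, amp, off = demodulate(1200.0, 700.0, 800.0, 1100.0)
print(f"distance = {d:.2f} m, amplitude = {amp:.1f}, offset = {off:.1f}")
```

Systematic distance errors of the kind the paper targets show up as deviations between this ideal arctangent model and the camera's actual (non-sinusoidal) reference signal.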
In this paper, field data on damage statistics of electrical and electronic household appliances are reported and investigated. These damages (approximately 74,000 cases in total), registered by five German insurance companies in 2005 and 2006, were reported by customers as caused by lightning overvoltages. With stochastic methods it is possible to reassess the collected data and to distinguish the cases that are with high probability caused by lightning overvoltages from those that are not. Cases with an indication of a direct strike were excluded, so the focus was on indirect lightning flashes only, i.e. flashes to ground near the structure and flashes to or near an incoming service line. The field data contain the location of the damaged apparatus (the residence of the policy holder) and the distance to the nearest cloud-to-ground stroke registered by the German lightning location network BLIDS on the date of the damage. The statistical data, together with some complementary numerical simulations, allow verifying how well the rules of the IEC 62305-2 standard correspond with the field data and identifying where corrections are needed. The results could lead to a better understanding of whether a damage reported to an insurance company is really caused by indirect lightning or not.
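A hedged sketch of the kind of data matching described here: each claim is paired with the nearest BLIDS-registered cloud-to-ground stroke on the claim date, and claims without a sufficiently close stroke are flagged. The record layout, all field names, the toy coordinates, and the 2 km plausibility threshold are hypothetical; the paper's actual stochastic reassessment is more involved.

```python
# Hedged sketch: match insurance claims to the nearest recorded
# cloud-to-ground stroke on the claim date. All field names, the toy
# records and the 2 km plausibility threshold are hypothetical.
from dataclasses import dataclass
from datetime import date
import math

@dataclass
class Stroke:                 # one lightning-location record (hypothetical layout)
    day: date
    lat: float
    lon: float

@dataclass
class Claim:                  # one insurance claim (hypothetical layout)
    day: date
    lat: float
    lon: float

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (haversine), in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def nearest_stroke_km(claim, strokes):
    """Distance to the nearest stroke on the claim date, or None."""
    same_day = [s for s in strokes if s.day == claim.day]
    if not same_day:
        return None
    return min(distance_km(claim.lat, claim.lon, s.lat, s.lon) for s in same_day)

strokes = [Stroke(date(2005, 7, 14), 50.78, 6.08), Stroke(date(2005, 7, 14), 50.90, 6.30)]
claim = Claim(date(2005, 7, 14), 50.77, 6.10)
d = nearest_stroke_km(claim, strokes)
plausible = d is not None and d <= 2.0        # assumed plausibility threshold
print(f"nearest stroke: {d:.2f} km -> {'plausibly' if plausible else 'probably not'} lightning-related")
```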
The ANM'09 multi-disciplinary scientific program includes topics in the fields of nanotechnology and microelectronics, ranging from "Bio/Micro/Nano Materials and Interfacing" aspects, "Chemical and Bio-Sensors", "Magnetic and Superconducting Devices" and "MEMS and Microfluidics" through "Theoretical Aspects, Methods and Modelling" to the important bridging session "Academics meet Industry".
Working paper distributed at the 2nd Annual Next Generation Telecommunications Conference 2009, 13th-14th October 2009, Brussels, 14 pages.
Abstract: Governments all over Europe are in the process of adopting new broadband strategies. The objective is to create modern telecommunications networks based on powerful broadband infrastructures. In doing so, they aim for innovative and investment-friendly concepts. For instance, in a recently published consultation paper on the subject the German regulator BNetzA declared that it will take "greater account of … reducing risks, securing the investment and innovation power, providing planning certainty and transparency – in order to support and advance broadband rollout in Germany". It further states that when regulating wholesale rates it has to be ensured that "… adequate incentives for network rollout are provided on the one hand, while sustainable and fair competition is ensured on the other". An EC draft recommendation on regulated network access is also about to set new standards for the regulation of next generation access networks. According to the recommendation, the prices of new assets shall be based on costs plus a project-specific risk premium, included in the cost of capital, for the investment risk incurred by the operator. This approach has been criticised from various sides. In particular it has been questioned whether such an approach is adequate to meet the objectives of encouraging both competition and investment in next generation access networks. Against this background, the concept of "long term risk sharing contracts" has recently been proposed as an approach which not only incorporates the various additional risks involved in the deployment of NGA infrastructure but has several other advantages. This paper will demonstrate that the concept allows competition to evolve at both the retail and the wholesale level on fair, objective, non-discriminatory and transparent terms and conditions. Moreover, it ensures the highest possible investment incentive in line with socially desirable outcomes. The paper is organised as follows: The next section briefly outlines the importance of encouraging competition and investment in an NGA environment. The third section specifies the design of long term risk sharing contracts in view of achieving these objectives. The fourth section examines potential problems associated with the concept and elaborates a way of dealing with them. The last section looks at arguments against long term risk sharing contracts and shows that these arguments are not strong enough to build a case against introducing such contracts.