Concept, scientific research and managerial applications of Provocative Coaching, based on the "Provocative Therapy" of Prof. Dr. Frank Farrelly (University of Wisconsin, USA): the application of the provocative communication style in specific situations of practical leadership, especially in the role of a coach for one's subordinates.
Adaptive logistics : information management for planning and control of small series assembly
(2007)
Interplanetary trajectories for low-thrust spacecraft are often characterized by multiple revolutions around the sun. Unfortunately, the convergence of traditional trajectory optimizers that are based on numerical optimal control methods depends strongly on an adequate initial guess for the control function (if a direct method is used) or for the starting values of the adjoint vector (if an indirect method is used). Especially when many revolutions around the sun are required, trajectory optimization becomes a very difficult and time-consuming task that involves a lot of experience and expert knowledge in astrodynamics and optimal control theory, because an adequate initial guess is extremely hard to find. Evolutionary neurocontrol (ENC) was proposed as a smart method for low-thrust trajectory optimization that fuses artificial neural networks and evolutionary algorithms into so-called evolutionary neurocontrollers (ENCs) [1]. Inspired by natural archetypes, ENC attacks the trajectory optimization problem from the perspective of artificial intelligence and machine learning, a perspective that is quite different from that of optimal control theory. Within the context of ENC, a trajectory is regarded as the result of a spacecraft steering strategy that permanently maps the current spacecraft state and the current target state onto the current spacecraft control vector. This way, the problem of searching for the optimal spacecraft trajectory is equivalent to the problem of searching for (or "learning") the optimal spacecraft steering strategy. An artificial neural network is used to implement such a steering strategy. It can be regarded as a parameterized function (the network function) that is defined by the internal network parameters. Therefore, each distinct set of network parameters defines a different network function and thus a different steering strategy. The problem of searching for the optimal steering strategy is then equivalent to the problem of searching for the optimal set of network parameters. Evolutionary algorithms that work on a population of (artificial) chromosomes are used to find the optimal network parameters, because the parameters can easily be mapped onto a chromosome. The trajectory optimization problem is solved when the optimal chromosome is found.
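The steering-strategy idea can be illustrated with a toy sketch: a tiny feed-forward network steers a 1-D point mass to a target, and its weights are found with a simple evolutionary strategy. Everything here (network size, dynamics, fitness function, mutation scheme) is an illustrative assumption and is far simpler than an actual ENC trajectory optimizer.

```python
import math, random

random.seed(0)

# Tiny "steering strategy": a one-hidden-layer network mapping
# (position, velocity, target) -> a bounded control command.
def network(params, x, v, target):
    n_hidden = 4
    out = 0.0
    for i in range(n_hidden):
        w1, w2, w3, b, wo = params[i * 5:(i + 1) * 5]
        h = math.tanh(w1 * x + w2 * v + w3 * target + b)
        out += wo * h
    return math.tanh(out)          # thrust command in [-1, 1]

def fitness(params):
    # Simulate a 1-D "transfer": start at rest at x=0, reach the
    # target with low residual speed. Higher fitness is better.
    x, v, target, dt = 0.0, 0.0, 1.0, 0.1
    for _ in range(100):
        a = network(params, x, v, target)
        v += a * dt
        x += v * dt
    return -(abs(x - target) + abs(v))

def evolve(pop_size=30, n_gen=60, n_params=20, sigma=0.3):
    # The network parameters are the "chromosome"; selection keeps the
    # best quarter, mutation produces Gaussian-perturbed offspring.
    pop = [[random.gauss(0, 1) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 4]
        pop = parents + [
            [w + random.gauss(0, sigma) for w in random.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
    return max(pop, key=fitness)

best = evolve()
```

The evolved chromosome encodes the steering strategy directly, so no initial guess for a control profile or adjoint vector is ever needed, which is the point the abstract makes.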
A comparison of solar sail trajectories that have been published by others [2, 3, 4, 5] with ENC-trajectories has shown that ENCs can be successfully applied for near-globally optimal spacecraft control [1, 6] and that they are able to find trajectories that are closer to the (unknown) global optimum, because they explore the trajectory search space more exhaustively than a human expert can do. The obtained trajectories are fairly accurate with respect to the terminal constraint. If a more accurate trajectory is required, the ENC-solution can be used as an initial guess for a local trajectory optimization method. Using ENC, low-thrust trajectories can be optimized without an initial guess and without expert attendance.
Here, new results for nuclear electric spacecraft and for solar sail spacecraft are presented, and it is shown that ENCs find very good trajectories even for very difficult problems. Trajectory optimization results are presented for:
1. NASA's Solar Polar Imager Mission, a mission to attain a highly inclined close solar orbit with a solar sail [7]
2. a mission to deflect asteroid Apophis with a solar sail from a retrograde orbit with a very-high-velocity impact [8, 9]
3. JPL's "2nd Global Trajectory Optimization Competition", a grand tour to visit four asteroids from different classes with a NEP spacecraft
Innovative interplanetary deep space missions, like a main belt asteroid sample return mission, require ever larger velocity increments (ΔV) and thus ever more demanding propulsion capabilities. Providing much larger exhaust velocities than chemical high-thrust systems, electric low-thrust space-propulsion systems can significantly enhance or even enable such high-energy missions. In 1995, a European-Russian Joint Study Group (JSG) presented a study report on "Advanced Interplanetary Missions Using Nuclear-Electric Propulsion" (NEP). One of the investigated reference missions was a sample return (SR) from the main belt asteroid (19) Fortuna. The envisaged nuclear power plant, Topaz-25, however, could not be realized, and worldwide developments in space reactor hardware stalled. In this paper, we investigate whether such a mission is also feasible using a solar electric propulsion (SEP) system and compare our SEP results to corresponding NEP results.
Near-Earth asteroid (NEA) 99942 Apophis provides a typical example for the evolution of asteroid orbits that lead to Earth impacts after a close Earth encounter that results in a resonant return. Apophis will have a close Earth encounter in 2029, with potentially very close subsequent Earth encounters (or even an impact) in 2036 or later, depending on whether it passes through one of several gravitational keyholes, less than 1 km in size, during its 2029 encounter. A pre-2029 kinetic impact is a very favorable option to nudge the asteroid out of a keyhole. The highest impact velocity, and thus the largest deflection, can be achieved from a trajectory that is retrograde to Apophis' orbit. With a chemical or electric propulsion system, however, many gravity assists and thus a long time are required to achieve this. We show in this paper that the solar sail might be the better propulsion system for such a mission: a solar sail Kinetic Energy Impactor (KEI) spacecraft could impact Apophis from a retrograde trajectory with a very high relative velocity (75-80 km/s) during one of its perihelion passages. The spacecraft consists of a 160 m × 160 m, 168 kg solar sail assembly and a 150 kg impactor. Although conventional spacecraft can also achieve the required minimum deflection of 1 km for this approximately 320 m-sized object from a prograde trajectory, our solar sail KEI concept also allows the deflection of larger objects. For a launch in 2020, we also show that, even after Apophis has flown through one of the gravitational keyholes in 2029, the solar sail KEI concept is still able to prevent Apophis from impacting the Earth, although many KEIs would be required for consecutive impacts to increase the total Earth-miss distance to a safe value.
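As a rough plausibility check of the numbers quoted in the abstract, the velocity change imparted by the impactor can be estimated from momentum conservation. The bulk density and the momentum enhancement factor below are assumptions for illustration, not values from the paper.

```python
import math

# Back-of-the-envelope deflection estimate for the Apophis KEI scenario.
diameter_m = 320.0       # approx. size quoted for Apophis [m]
density    = 2600.0      # kg/m^3, ASSUMED bulk density
m_impactor = 150.0       # kg, impactor mass from the abstract
v_rel      = 77.5e3      # m/s, mid-range of the quoted 75-80 km/s
beta       = 1.0         # momentum enhancement factor, ASSUMED (no ejecta)

radius = diameter_m / 2.0
m_ast  = density * (4.0 / 3.0) * math.pi * radius**3   # asteroid mass [kg]

# Imparted velocity change from momentum conservation.
dv = beta * m_impactor * v_rel / m_ast
print(f"asteroid mass ~ {m_ast:.2e} kg, delta-v ~ {dv * 1000:.3f} mm/s")

# Over the ~9 years between a 2020 impact and the 2029 encounter, even a
# sub-mm/s delta-v accumulates a large along-track displacement.
t = 9 * 365.25 * 86400.0
print(f"naive along-track drift ~ {dv * t / 1000:.0f} km")
```

Even this crude estimate gives a displacement orders of magnitude above the 1 km minimum deflection, consistent with the claim that the concept scales to larger objects.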
Under DLR contract, sample return missions to the large main-belt asteroid (19) Fortuna have been studied. The mission scenario is based on three ion thrusters of the RIT-22 model, which is presently under space qualification, and on solar arrays equipped with triple-junction GaAs solar cells. After the spacecraft design, the orbit-to-orbit trajectories for both a one-way SEP mission with a chemical sample return and an all-SEP return mission were optimized using a combination of artificial neural networks with evolutionary algorithms. Additionally, body-to-body trajectories were investigated within a launch period between 2012 and 2015. For the orbit-to-orbit calculation, the launch masses of the hybrid mission and of the all-SEP mission resulted in 2.05 tons and 1.56 tons, respectively, including a scientific payload of 246 kg. The corresponding transfer durations were 4.14 yrs and 4.62 yrs. Finally, a comparison between the mission scenarios based on SEP and on NEP was carried out, clearly favouring SEP.
We propose a simple parametric optical solar sail degradation (OSSD) model that describes the variation of the sail film's optical coefficients with time, depending on the sail film's environmental history, i.e., the radiation dose. The primary intention of our model is not to describe the exact behavior of specific film-coating combinations in the real space environment, but to provide a more general parametric framework for describing the general optical degradation behavior of solar sails.
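A minimal sketch of what such a parametric degradation law can look like, assuming a simple exponential approach of an optical coefficient toward a fully degraded end-of-life value as radiation dose accumulates. The functional form and all constants here are illustrative assumptions, not the model actually proposed in the paper.

```python
import math

def degraded_coefficient(p0, p_end, dose, half_life_dose):
    """Optical coefficient after an accumulated radiation dose (illustrative).

    p0             -- beginning-of-life value (e.g. reflectivity ~0.9)
    p_end          -- fully degraded end-of-life value (assumed)
    dose           -- accumulated dose, here in 'sun-years' (assumed unit)
    half_life_dose -- dose at which half the total degradation has occurred
    """
    lam = math.log(2.0) / half_life_dose
    return p_end + (p0 - p_end) * math.exp(-lam * dose)

# Reflectivity decaying from 0.90 toward 0.80 with a half-life of 1 sun-year.
for dose in (0.0, 1.0, 3.0):
    print(dose, round(degraded_coefficient(0.90, 0.80, dose, 1.0), 4))
```

The appeal of such a parametric form is exactly what the abstract argues: the two end values and the rate constant can be re-fitted per film-coating combination without changing the framework.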
Since fluid-structure interaction within the finite-element method is state of the art in many engineering fields, this method is applied here to voice analysis. A quasi two-dimensional model of the vocal folds, including the ventricular folds, is presented. First results of self-sustained vocal fold oscillation are shown, and possibilities as well as limitations are discussed.
Flow separation is a phenomenon that occurs in all kinds of supersonic nozzles at some point during run-up and shut-down operations. Especially in expansion nozzles of rocket engines with large area ratio, flow separation can trigger strong side loads that can damage the structure of the nozzle. The investigation presented in this paper seeks to establish measures that may be applied to alter the point of flow separation. In order to achieve this, a supersonic nozzle was placed at the exit plane of the conical nozzle. This resulted in the generation of a cross flow surrounding the core jet flow from the conical nozzle. Due to the entrainment of the gas stream from the conical nozzle, the pressure in its exit plane was found to be lower than ambient. Cold gas instead of hot combustion gas was used as the working fluid. A mathematical simulation of the concept was validated by experiment. Measurements confirmed the simulation results: due to the introduction of a second nozzle, the pressure in the separated region of the conical nozzle was significantly reduced. It was also established that the boundary layer separation inside the conical nozzle was delayed, thus allowing an increased degree of overexpansion. The condition established by the pressure measurements was also demonstrated qualitatively using transparent nozzle configurations.
Abstract of the authors: In many areas of computer science, ontologies are becoming more and more important. The use of ontologies for domain modeling often brings up the issue of ontology integration: the task of merging several ontologies, each covering a specific subdomain, into one united ontology has to be solved. Many approaches for ontology integration aim at automating the process of ontology alignment. However, complete automation is not feasible, and user interaction is always required. Nevertheless, most ontology integration tools offer only very limited support for the interactive part of the integration process. In this paper, we present a novel approach for the interactive integration of ontologies. The result of the ontology integration is incrementally updated after each definition of a correspondence between ontology elements, and the user is guided through the ontologies to be integrated. By restricting the possible user actions, the tool we developed ensures the integrity of all defined correspondences. We evaluated our tool by integrating different regulations concerning building design.
7th International Conference on Reliability of Materials and Structures (RELMAS 2008), June 17-20, 2008, Saint Petersburg, Russia, pp. 354-358. Reprint with corrections in red. Introduction: The analysis of advanced structures working under extreme heavy loading, such as nuclear power plants and piping systems, should take into account the randomness of loading, geometrical and material parameters. The existing reliability methods are restricted mostly to the elastic working regime, e.g. allowable local stresses. The development of limit and shakedown reliability-based analysis and design methods, exploiting the potential of the shakedown working regime, is highly needed. In this paper the application of a new algorithm of probabilistic limit and shakedown analysis for shell structures is presented, in which the loading and strength of the material as well as the thickness of the shell are considered as random variables. The reliability analysis problems can be efficiently solved by a system combining available FE codes, a deterministic limit and shakedown analysis, and the First- and Second-Order Reliability Methods (FORM/SORM). Non-linear sensitivity analyses are obtained directly from the solution of the deterministic problem without extra computational cost.
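The FORM building block of such a system can be illustrated on the simplest possible limit state, g = R - S with independent normal capacity R and load S, where the Hasofer-Lind reliability index has a closed form. The numbers are purely illustrative, and the FE-based shakedown coupling of the paper is of course not reproduced here.

```python
import math, random

# Minimal FORM-style reliability check for a linear limit state
# g = R - S (capacity minus load effect), both normal and independent.
mu_R, sig_R = 10.0, 1.0    # capacity, e.g. a shakedown load factor (assumed)
mu_S, sig_S = 6.0, 1.5     # applied load effect (assumed)

# For a linear g with normal variables the reliability index is exact,
# and the failure probability is Pf = Phi(-beta).
beta = (mu_R - mu_S) / math.sqrt(sig_R**2 + sig_S**2)
pf_form = 0.5 * math.erfc(beta / math.sqrt(2.0))

# Crude Monte Carlo cross-check of the same limit state.
random.seed(1)
n = 200_000
fails = sum(
    random.gauss(mu_R, sig_R) - random.gauss(mu_S, sig_S) < 0.0
    for _ in range(n)
)
pf_mc = fails / n
print(f"beta = {beta:.3f}, Pf(FORM) = {pf_form:.2e}, Pf(MC) ~ {pf_mc:.2e}")
```

For nonlinear limit states (as in shakedown analysis) FORM instead linearizes g at the most probable failure point, which is where the sensitivity analyses mentioned in the abstract come in.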
On 1 January 1998, the German telecom market was fully liberalised. Since then, genuine competition between market participants has developed, based on a comprehensive legal and regulatory framework that provides safeguards against unfair competition and against market power of Deutsche Telekom. Today, about 10 years after the liberalisation of the telecommunications sector, a revision of this regulatory approach has become necessary, because in at least three dimensions the situation is quite different from that of 10 years ago. First, with numerous established alternative operators in the market, monopolies have been successfully challenged and competition introduced. Second, not only is cable TV becoming a viable alternative for the provision of broadband services in large parts of Germany, but mobile services are also increasingly becoming a substitute for fixed services. Last but not least, important technological changes are under way, requiring huge investments in infrastructure upgrades for next-generation networks. In the light of these developments, the question is to what extent the current approach of severe ex-ante regulatory intervention is still appropriate. Is any part of the network of the former incumbent still a bottleneck? A more light-handed regulatory approach might be the right response to this new situation. The paper is organised as follows: The first section briefly examines the economic rationale for regulating network access. Based on the assumption that regulation is always necessary when bottlenecks exist, regulatory principles for an efficient network access regime are derived. The second section compares the situation of the German market in early 1998 with that of today, considering three dimensions: the degree of competition, the potential for substitution, and technological developments. The third section defines some requirements for the future regulation of telecom markets. Proposals are elaborated on how to ensure competitive telecom markets in the light of new economic and technological challenges.
Table of contents:
- Kettern, Jürgen: Greeting from the Dean
- Kleiker, Walter: Greeting from the Section Head
- Construction of the alpha.net
- Wijnhoven, Ralf: Construction of 14 well installations and 4 km of piping for regulating the groundwater level and the receiving watercourse in Voerde-Mehrum. pp. 10-11
- Ruwisch, Thomas: Track layout change at Düsseldorf-Gerresheim station: wish, planning, planning law and the environment. pp. 12-13
- Rickes, Tanja: Construction sequence control of a linear construction site, using the roadway construction of the "HSL-Zuid" as an example. pp. 14-15
- Plum, Andreas: Fire protection engineering: an engineering discipline has crossed the acceptance threshold. Part 1: Fire simulation calculation. pp. 18-19
- Flesch, Tobias: Fire protection engineering: an engineering discipline has crossed the acceptance threshold. Part 2: Structural design in case of fire. pp. 20-21
- Müller, Bert: New building for the media group RTL Deutschland: high tech behind a listed facade. pp. 24-25
- Hermens, Gereon: Increasing the acceptance of hydropower through the construction of fish passes: environmentally sound energy for nature and climate. pp. 26-27
- Klitzing, Bastian: Wastewater treatment plant optimization: forward-looking planning in wastewater treatment and energy recovery. pp. 28-29
- Ginster, Marco: The new Rhine bridge at Wesel. pp. 32-33
- Tilke, Hermann; Wersch, Ralf van: Building for a fast sport. pp. 34-35
- How did the global financial crisis come about?
- What characterizes the financial crisis?
- Who is to blame for the financial crisis?
- What role did speculators play?
- To what extent is the financial crisis self-inflicted?
- Is globalization to blame for the financial crises in numerous regions of the world?
- How does the financial crisis affect the real economy?
- What ways are there out of the crisis?
The sorption of toxic-shock LPS by nanoparticles based on carbonized vegetable raw materials
(2008)
Immobilization of lactobacilli on high-temperature carbonized vegetable raw material (rice husk, grape stones) increases their physiological activity and the quantity of antibacterial metabolites, which consequently leads to an increase in the antagonistic activity of the lactobacilli. This implies that the use of nanosorbents for the attachment of probiotic microorganisms is highly promising for solving important problems, such as delivering probiotic preparations to the right place and attaching them to the intestinal mucosa, with subsequent detoxification of the gastrointestinal tract and normalization of its microecology. In addition, the carbonized nanoparticles obtained in this way have a peculiar property: the ability to sorb the LPS of toxic shock and hence to detoxify LPS.
Numerical models have become an essential part of snow avalanche engineering. Recent advances in understanding the rheology of flowing snow and the mechanics of entrainment and deposition have made numerical models more reliable. Coupled with field observations and historical records, they are especially helpful in understanding avalanche flow in complex terrain. However, the application of numerical models poses several new challenges to avalanche engineers. A detailed understanding of the avalanche phenomena is required to specify initial conditions (release zone dimensions and snowcover entrainment rates) as well as the friction parameters, which are no longer based on empirical back-calculations but rather on terrain roughness, vegetation and snow properties. In this paper we discuss these problems by presenting the computer model RAMMS, which was specially designed by the SLF as a practical tool for avalanche engineers. RAMMS solves the depth-averaged equations governing avalanche flow with first- and second-order numerical solution schemes. A tremendous effort has been invested in the implementation of advanced input and output features. Simulation results are therefore clearly and easily visualized, simplifying their interpretation. More importantly, RAMMS has been applied to a series of well-documented avalanches to gauge model performance. In this paper we present the governing differential equations, highlight some of the input and output features of RAMMS, and then discuss the simulation of the Gatschiefer avalanche that occurred in April 2008 near Klosters/Monbiel, Switzerland.
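Depth-averaged avalanche models of this kind typically close the momentum equation with a Voellmy-type friction law: a Coulomb term scaling with the normal load plus a turbulent drag term scaling with velocity squared. A minimal sketch follows; the parameter values are illustrative and uncalibrated, not RAMMS defaults.

```python
import math

def voellmy_friction(h, u, slope_deg, mu=0.2, xi=2000.0,
                     rho=300.0, g=9.81):
    """Frictional resistance per unit area [Pa] of a flowing snow layer.

    h         -- flow depth [m]
    u         -- depth-averaged flow speed [m/s]
    slope_deg -- local slope angle [deg]
    mu        -- dry Coulomb friction coefficient (dimensionless, assumed)
    xi        -- turbulent friction coefficient [m/s^2] (assumed)
    rho       -- flow density [kg/m^3] (assumed)
    """
    phi = math.radians(slope_deg)
    coulomb   = mu * rho * g * h * math.cos(phi)   # load-dependent part
    turbulent = rho * g * u**2 / xi                # velocity-dependent part
    return coulomb + turbulent

# At low speed the Coulomb term dominates; at high speed the u^2 term does.
print(voellmy_friction(h=2.0, u=5.0,  slope_deg=30.0))
print(voellmy_friction(h=2.0, u=40.0, slope_deg=30.0))
```

This two-parameter closure is what makes the calibration question raised in the abstract so central: mu and xi must be tied to terrain roughness, vegetation and snow properties rather than fitted per event.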
Time-of-flight (ToF) sensors have become an alternative to conventional distance sensing techniques like laser scanners or image-based stereo. ToF sensors provide full-range distance information at high frame rates and thus have a significant impact on current research in areas like online object recognition, collision prevention and scene reconstruction. However, ToF cameras like the photonic mixer device (PMD) still exhibit a number of challenges regarding static and dynamic effects, e.g. systematic distance errors and motion artefacts, respectively. Sensor calibration techniques reducing static system errors have been proposed and show promising results. However, current calibration techniques generally need a large set of reference data in order to determine the corresponding parameters for the calibration model. This paper introduces a new calibration approach which combines different demodulation techniques for the ToF camera's reference signal. Examples show that the resulting combined demodulation technique yields improved distance values based on only two required reference data sets.
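For context, the distance values of such cameras come from demodulating the correlation between the emitted and received modulation signal. The standard four-phase ("four-bucket") scheme can be sketched as follows, on synthetic numbers rather than any PMD-specific signal model.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def demodulate(a0, a90, a180, a270, f_mod):
    """Recover distance, amplitude and offset from four correlation
    samples taken at phase offsets of 0, 90, 180 and 270 degrees."""
    phase = math.atan2(a90 - a270, a0 - a180) % (2 * math.pi)
    amplitude = 0.5 * math.hypot(a90 - a270, a0 - a180)
    offset = 0.25 * (a0 + a90 + a180 + a270)
    # Phase accumulates over the two-way travel path, hence the factor 4*pi.
    distance = C * phase / (4 * math.pi * f_mod)
    return distance, amplitude, offset

# Synthetic example: a target at 3.0 m with 20 MHz modulation
# (unambiguous range ~7.5 m).
f_mod, true_d = 20e6, 3.0
true_phase = 4 * math.pi * f_mod * true_d / C
samples = [10.0 + 4.0 * math.cos(true_phase - k * math.pi / 2)
           for k in range(4)]
d, amp, off = demodulate(*samples, f_mod)
print(round(d, 3), round(amp, 2), round(off, 2))
```

Systematic ("wiggling") distance errors arise because the real correlation signal is not the pure sinusoid assumed here, which is what the calibration approaches discussed in the abstract compensate for.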
Solar tower power plants
(2008)
Optimization of the reaeration potential on embankment stepped spillways in skimming flow regime
(2008)
Tool supported requirements analysis for the user centered development of mobile enterprise software
(2008)
A user-centered development method has proved satisfactory for the development of mobile enterprise software. To make use of this method, detailed information about the user and the place where the user interacts with his mobile device is required. This article describes how both can be modeled by a UML extension based on stereotypes and conceptual extensions. Finally, a software tool is presented that supports the developed UML extension.
The ecological redesign of the Dhünn in Leverkusen: an example of practical planning
(2009)
Dipl.-Ing. (FH) Dieter Stein, M.Sc. (Environmental Engineering), Landschaftsarchitekt AKNW, Grevenbroich. 14 pages (pp. 78-91). Contribution to the 2nd Aachener Softwaretag in der Wasserwirtschaft <2, 2009, Aachen>. Author's summary: Taking into account existing local conditions and technical installations (such as traffic routes, supply lines, intake structures, etc.), the Dhünn has been given more freedom of movement within its dikes, while flood protection remains guaranteed. The woody vegetation along the banks was largely preserved. Where the construction work required the removal of trees and shrubs, the areas were deliberately left to develop on their own; new plants and woody vegetation will settle there over time. With the near-natural redesign of the Dhünn as a "starting aid", nature can take over further development: animal and plant species can "reclaim" the formerly channelized watercourse, and fish and small animals find food and resting places. For people, a valuable recreational space is created in the middle of the city. Despite numerous stakeholders and considerable restrictions, the project was realized in a relatively short time. Special thanks are due to the Wupperverband (Untere Lichtenplatzer Str. 100, 42289 Wuppertal), and in particular to the project managers Arnim Lützenberger and Andreas Oberborbeck, for excellent cooperation.
In this paper, field data related to damage statistics of electrical and electronic apparatus in households are reported and investigated. These damages (approx. 74000 cases in total), registered by five German insurance companies in 2005 and 2006, were reported by customers as caused by lightning overvoltages. Using stochastic methods, it is possible to reassess the collected data and to distinguish between cases which are with high probability caused by lightning overvoltages and those which are not. If there was an indication of a direct strike, the case was excluded, so the focus was only on indirect lightning flashes; i.e., only flashes to ground near the structure and flashes to or near an incoming service line were investigated. The field data contain the location of the damaged apparatus (residence of the policy holder) and the distances of the nearest cloud-to-ground stroke to the location of the damage, registered by the German lightning location network BLIDS at the date of damage. The statistical data, along with some complementary numerical simulations, make it possible to verify the correspondence of the rules of IEC 62305-2 with the field data and to define some correction needs. The results could lead to a better understanding of whether a damage reported to an insurance company is really caused by indirect lightning or not.
[Paper of the X International Symposium on Lightning Protection 9th - 13th November, 2009 - Curitiba, Brazil. 6 pages] The international standard IEC 62305-3, published in 2006, requires as an integral part of the lightning protection system (LPS) the consideration of a separation distance between the conductors of the LPS and metal and electrical installations inside the structure to be protected. IEC 62305-3 gives two different methods for this calculation: a standard, simplified approach and a more detailed approach, which differ especially regarding the treatment of the current sharing effect on the LPS conductors. Hence, different results for the separation distance are possible, leading to some discrepancies in the use of the standard. The standard approach defined in the main part (Clause 6.3) and in Annex C of the standard in some cases may lead to a severe oversizing of the required separation distance. The detailed approach described in Annex E naturally gives more correct results. However, a calculation of the current sharing amongst all parts of the air-termination and downconductor network is necessary, in many cases requiring the use of network analysis programs. In this paper simplified methods for the assessment of the current sharing are presented, which are easy to use as well as sufficiently adequate.
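For reference, the separation distance in IEC 62305-3 is computed as s = k_i (k_c / k_m) l, and the two approaches in the abstract differ essentially in how the current-sharing coefficient k_c is determined. The sketch below uses coefficient values as commonly quoted in summaries of the standard; consult the standard itself for binding values.

```python
# Separation distance per the IEC 62305-3 formula s = k_i * (k_c / k_m) * l.
K_I = {"I": 0.08, "II": 0.06, "III": 0.04, "IV": 0.04}  # per LPS class
K_M = {"air": 1.0, "solid": 0.5}                        # insulating medium

def separation_distance(lps_class, kc, length_m, medium="air"):
    """Required separation distance s [m].

    kc       -- current-sharing coefficient: 1.0 for a single down-conductor,
                smaller when the lightning current splits over several
                conductors (this is what the detailed approach computes)
    length_m -- length along the LPS conductor from the point where the
                separation is evaluated to the nearest equipotential bond
    """
    return K_I[lps_class] * (kc / K_M[medium]) * length_m

# Single down-conductor (kc = 1) vs. current shared over four (kc ~ 0.25):
print(separation_distance("III", 1.0, 10.0))
print(separation_distance("III", 0.25, 10.0))
```

The factor-of-four spread between the two calls illustrates why the simplified treatment of current sharing can severely oversize the required separation, which is the discrepancy the paper addresses.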
8th VDE/ABB-Blitzschutztagung, 29-30 October 2009, Neu-Ulm. Blitzschutztagung <8, 2009, Neu-Ulm>. Berlin: VDE Verl., 2009. Large power plants can be endangered by lightning discharges, with potential effects on their availability and safety. A very specific scenario, which had to be investigated for topical reasons, concerns a lightning strike close to the plant into the high-voltage overhead line at the plant's grid connection. If a so-called shielding failure is additionally assumed, i.e. the direct lightning strike hits a phase conductor of the high- or extra-high-voltage network rather than the shield wire strung above it, this constitutes an extreme electromagnetic impact. This paper deals with the simulation of such a lightning strike and its effects on the grid connection and on the components of a power plant's auxiliary electrical systems at the lower voltage levels. The findings can be transferred without restriction to industrial plants with a medium-voltage grid connection and no power generation of their own.
The ANM’09 multi-disciplinary scientific program includes topics in the fields of "Nanotechnology and Microelectronics" ranging from "Bio/Micro/Nano Materials and Interfacing" aspects, "Chemical and Bio-Sensors", "Magnetic and Superconducting Devices", "MEMS and Microfluidics" over "Theoretical Aspects, Methods and Modelling" up to the important bridging "Academics meet Industry".
Working paper distributed at the 2nd Annual Next Generation Telecommunications Conference 2009, 13-14 October 2009, Brussels. 14 pages. Abstract: Governments all over Europe are in the process of adopting new broadband strategies. The objective is to create modern telecommunications networks based on powerful broadband infrastructures. In doing so, they aim for innovative and investment-friendly concepts. For instance, in a recently published consultation paper on the subject, the German regulator BNetzA declared that it will take "greater account of … reducing risks, securing the investment and innovation power, providing planning certainty and transparency – in order to support and advance broadband rollout in Germany". It further states that when regulating wholesale rates it has to be ensured that "… adequate incentives for network rollout are provided on the one hand, while sustainable and fair competition is ensured on the other". Also, an EC draft recommendation on regulated network access is about to set new standards for the regulation of next generation access networks. According to the recommendation, the prices of new assets shall be based on costs plus a project-specific risk premium, included in the cost of capital, for the investment risk incurred by the operator. This approach has been criticised from various sides. In particular, it has been questioned whether such an approach is adequate to meet the objectives of encouraging both competition and investment in next generation access networks. Against this background, the concept of "long term risk sharing contracts" has recently been proposed as an approach which not only incorporates the various additional risks involved in the deployment of NGA infrastructure, but has several other advantages. This paper demonstrates that the concept allows competition to evolve at both the retail and wholesale level on fair, objective, non-discriminatory and transparent terms and conditions. Moreover, it ensures the highest possible investment incentive in line with socially desirable outcomes. The paper is organised as follows: The next section briefly outlines the importance of encouraging competition and investment in an NGA environment. The third section specifies the design of long term risk sharing contracts in view of achieving these objectives. The fourth section examines potential problems associated with the concept and elaborates a way of dealing with them. The last section looks at arguments against long term risk sharing contracts; it is shown that these arguments are not strong enough to build a case against introducing such contracts.
Dipl.-Ing. Stefan Overkamp, GISWORKS GbR, Velbert. 11 pages (pp. 7-17). Contribution to the 2nd Aachener Softwaretag in der Wasserwirtschaft <2, 2009, Aachen>. From the outline: 1 Geographic information systems; 2 Requirements for a municipal spatial data infrastructure (GDI); 3 Components of a GDI: 3.1 geodata management, 3.2 geodata storage, 3.3 high-end GIS, 3.4 intraGIS; 4 GISWORKS; 5 Further information.
Dr.-Ing. Oliver Stoschek [et al.], DHI Wasser und Umwelt GmbH, Syke. 14 pages (pp. 45-58). Contribution to the 2nd Aachener Softwaretag in der Wasserwirtschaft <2, 2009, Aachen>. Ecological models for calculating temperature changes and the influence of cooling water on the water temperature of the Elbe. The work presented was commissioned and supported by the ARGE Elbe in the course of the revision of the Elbe heat load plan (Wärmelastplan Elbe).
Combined sewage discharges into watercourses according to BWK guidance document M3: advantages of the detailed verification
(2009)
Dipl.-Ing. Brigitte Huber and Dr.-Ing. Gerd Demny, Wasserverband Eifel-Rur, Düren. 16 pages (pp. 59-74). Contribution to the 2nd Aachener Softwaretag in der Wasserwirtschaft <2, 2009, Aachen>. Authors' summary: For the urban catchment of the Broicher Bach, both a simplified and a detailed verification according to BWK-M3 were carried out. It turned out that the methodology of the simplified verification is not suitable for obtaining a realistic representation of the discharge-dominated flows of the watercourse. This is mainly due to the neglect of wave translation and retention in the channel. The resulting misjudgment of the flow conditions obstructs the view of appropriate measure planning. The detailed verification, carried out with a rainfall-runoff (NA) model, is more laborious to set up but paints a realistic picture of the flow increase caused by the discharges. With the help of the model, the main influences can be quickly localized and effective measure variants identified. In the example of the Broicher Bach presented here, the eight originally identified measures could be reduced to one, and the total volume of the required retention was halved. In the authors' view, the comparison of the two verification methods suggests that the simplified verification should be used at most for a first estimate of the need for measures. Measure identification and dimensioning should always be carried out with the detailed verification method, based on a corresponding rainfall-runoff model. This applies in particular to watercourse reaches whose flow is dominated by several consecutive discharge points.
Dipl.-Ing. Hartmut Hoevel, Erftverband, Bergheim. 3 pages (pp. 75-77). Contribution to the 2nd Aachener Softwaretag in der Wasserwirtschaft <2, 2009, Aachen>. The Erft is a left tributary of the Rhine. The river is heavily affected, in terms of both water volume and temperature, by pumped groundwater discharged from the Rhenish lignite open-cast mines. The considerations presented concern two questions: 1. What should the Erft look like in 2045, i.e. after the expected end of open-cast mining activities in the Rhenish lignite district and the associated end of mine-water discharge? 2. How quickly and in what way should the redesign of the Erft take place?
One of the interesting but not well-known properties of water is the appearance of highly ordered structures in response to a strong electric field. In 1893, Sir William Armstrong placed a cotton thread between two wine glasses filled with chemically pure water. When a high DC voltage was applied between the glasses, a connection consisting of water formed, producing a "water bridge".
The absence of a general method for endotoxin removal from liquid interfaces provides an opportunity to find new methods and materials to close this gap. Activated nanostructured carbon is a promising material that has shown good adsorption properties due to its vast pore network and high surface area. The aim of this study is to determine the adsorption rates for a carbonaceous material produced at different temperatures, as well as to reveal possible differences in the performance of the material for each of the adsorbates used during the study (hemoglobin, serum albumin and lipopolysaccharide, LPS).
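Adsorption rates of this kind are often reported by fitting a pseudo-first-order kinetic model to uptake-versus-time data. A minimal sketch follows; the equilibrium capacity and rate constant are purely illustrative assumptions, not measured values from the study.

```python
import math

def pfo_uptake(q_e, k1, t):
    """Pseudo-first-order adsorption kinetics: q(t) = q_e * (1 - exp(-k1*t)).

    q_e -- equilibrium adsorption capacity [mg/g] (assumed)
    k1  -- rate constant [1/min] (assumed)
    t   -- contact time [min]
    """
    return q_e * (1.0 - math.exp(-k1 * t))

# Example: equilibrium capacity 50 mg/g, rate constant 0.05 1/min.
for t in (0, 30, 120):
    print(t, round(pfo_uptake(50.0, 0.05, t), 2))
```

Fitting q_e and k1 separately for each adsorbate (hemoglobin, serum albumin, LPS) and each carbonization temperature is one straightforward way to quantify the performance differences the study looks for.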