The Scarab Project
(2015)
Urban Search and Rescue (USAR) is an active research field in the robotics community. Despite recent advances on many open research questions, these kinds of systems are not widely used in real rescue missions. One reason is that such systems are complex and not (yet) very reliable; another is that one has to be a robotics expert to run such a system. Moreover, available rescue robots are very expensive and the benefits of using them are still limited.
In this paper, we present the Scarab robot, an alternative design for a USAR robot. The robot is lightweight and human-packable, and its primary purpose is to extend the rescuer's capability to sense the disaster site. The idea is that a responder throws the robot to a certain spot. The robot survives the impact with the ground and relays sensor data such as camera images or thermal images to the responder's hand-held control unit, from which the robot can be remotely controlled.
The growing body of political texts opens up new opportunities for rich insights into political dynamics and ideologies but also increases the workload for manual analysis. Automated speaker attribution, which detects who said what to whom in a speech event and is closely related to semantic role labeling, is an important processing step for computational text analysis. We study the potential of the large language model family Llama 2 to automate speaker attribution in German parliamentary debates from 2017 to 2021. We fine-tune Llama 2 with QLoRA, an efficient training strategy, and observe that our approach achieves competitive performance in the GermEval 2023 Shared Task on Speaker Attribution in German News Articles and Parliamentary Debates. Our results shed light on the capabilities of large language models in automating speaker attribution, revealing a promising avenue for the computational analysis of political discourse and the development of semantic role labeling systems.
In 2002, the Ministry of Science and Research in North Rhine-Westphalia created eight platforms of excellence, one of them in the research area "Energy and Environment" at ACUAS. This platform concentrates the research and development of 13 professors in Jülich and Aachen and of two scientific institutes with different focuses: NOWUM-Energy, with an emphasis on efficient and economic energy conversion, and the Solar-Institut Jülich (SIJ), the largest research institute in the field of renewables at a University of Applied Sciences in Germany. With this platform, every form of energy conversion (nuclear, fossil, renewable) can be addressed to help solve two of the most important problems of mankind: energy and potable water. At the CSE, the historical development, some research results and the combined master programmes in "Energy Systems" and "Nuclear Applications" are presented.
One of the most important parameters in a combustion chamber, whether in power stations or in waste-to-energy plants, is the temperature, which lies in the range of 700-1500 °C. One of the most advanced measuring methods is acoustic pyrometry, which can produce a temperature map in one plane of the combustion chamber, comparable to computer tomography. The results of these measurements, discussed in the presentation, can be used to fulfil the legal requirements in the FRG or in the EU; to equalise the temperature in one plane of the combustion chamber in order to optimise steam production (better plant efficiency) and to minimise the production of temperature-controlled flue gas components (NO, CO and others); and to control the SNCR process, if used.
Multi-interface level sensors and new development in monitoring and control of oil separators
(2006)
In the oil industry, huge savings may be made if suitable multi-interface level measurement systems are employed for effectively monitoring crude oil separators and efficiently controlling their operation. A number of techniques, e.g. externally mounted displacers, differential pressure transmitters and capacitance rod devices, have been developed to measure the separation process with gas, oil, water and other components. Because of the unavailability of suitable multi-interface level measurement systems, oil separators are currently operated by trial and error. In this paper, some conventional techniques that have been used for level measurement in industry, as well as new developments, are discussed.
A multi-sensor system is a chemical sensor system that quantitatively and qualitatively records gases with a combination of cross-sensitive gas sensor arrays and pattern recognition software. This paper addresses the issue of data analysis for the identification of gases in a gas sensor array. We introduce a software tool for gas sensor array configuration and simulation: a modular software package for acquiring data from different sensors. A signal evaluation algorithm referred to as the matrix method was used specifically for the software tool. This matrix method computes the gas concentrations from the signals of a sensor array. The software tool was used to simulate an array of five sensors for determining the gas concentrations of CH4, NH3, H2, CO and C2H5OH. The results of the simulated sensor array indicate that the software tool is capable of the following: (a) identifying a gas independently of its concentration; (b) estimating the concentration of a gas, even if the system was not previously exposed to this concentration; (c) indicating when a gas concentration exceeds a certain value. A gas sensor database was built for the configuration of the software. With the database one can create, generate and manage scenarios and source files for the simulation. Based on the gas sensor database and the simulation software, an online web-based version was developed with which the user can configure and simulate sensor arrays online.
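The core idea of such a matrix method can be sketched as a linear calibration model: each sensor signal is treated as a weighted sum of the gas concentrations, so stacking the calibrated sensitivities into a matrix K gives s = K·c, and inverting that system recovers the concentrations. The sketch below is an illustration of this general principle, not the paper's actual implementation; the sensitivity values and gas assignments are hypothetical.

```python
# Sketch of a "matrix method": sensor signals s are modeled as a linear
# combination of gas concentrations c via a calibrated sensitivity
# matrix K (rows = sensors, columns = gases), i.e. s = K @ c.
# Solving the square system recovers the concentrations.

def solve_linear(K, s):
    """Solve K c = s by Gaussian elimination with partial pivoting."""
    n = len(K)
    A = [row[:] + [s[i]] for i, row in enumerate(K)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for k in range(col, n + 1):
                A[r][k] -= f * A[col][k]
    c = [0.0] * n
    for i in reversed(range(n)):
        c[i] = (A[i][n] - sum(A[i][j] * c[j] for j in range(i + 1, n))) / A[i][i]
    return c

# Hypothetical 3-sensor array, cross-sensitive to CH4, NH3 and CO:
K = [[2.0, 0.3, 0.1],
     [0.2, 1.5, 0.4],
     [0.1, 0.2, 1.8]]
true_c = [10.0, 5.0, 2.0]  # assumed concentrations (ppm)
signals = [sum(K[i][j] * true_c[j] for j in range(3)) for i in range(3)]
estimated = solve_linear(K, signals)  # recovers the concentrations
```

In practice the cross-sensitivities make K non-diagonal, which is exactly why the array as a whole can disentangle a gas mixture that no single sensor could identify alone.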
This study investigated the anaerobic digestion of an algal–bacterial biofilm grown in artificial wastewater in an Algal Turf Scrubber (ATS). The ATS system was located in a greenhouse (50°54′19ʺN, 6°24′55ʺE, Germany) and was exposed to seasonal conditions during the experimental period. The methane (CH4) potential of untreated algal–bacterial biofilm (UAB) and thermally pretreated biofilm (PAB) using different microbial inocula was determined by anaerobic batch fermentation. The methane productivity of UAB differed significantly between the microbial inocula of digested wastepaper, a mixture of manure and maize silage, anaerobic sewage sludge, and percolated green waste. UAB with sewage sludge as inoculum showed the highest methane productivity. The share of methane in the biogas was dependent on the inoculum. Using PAB, a strong positive impact on methane productivity was identified for the digested wastepaper (116.4%) and the mixture of manure and maize silage (107.4%) inocula. By contrast, the methane yield was significantly reduced for the digested anaerobic sewage sludge (50.6%) and percolated green waste (43.5%) inocula. To further evaluate the potential of algal–bacterial biofilm for biogas production in wastewater treatment and biogas plants in a circular bioeconomy, scale-up calculations were conducted. It was found that a 0.116 km² ATS would be required for an average municipal wastewater treatment plant, which can be viewed as problematic in terms of space consumption. However, a substantial energy surplus (4.7–12.5 MWh a−1) can be gained through the addition of algal–bacterial biomass to the anaerobic digester of a municipal wastewater treatment plant. Wastewater treatment with subsequent energy production through algae shows dominance over conventional technologies.
The present work aimed to study the mainstream feasibility of the deammonifying sludge from the side stream of the municipal wastewater treatment plant (MWWTP) in Kaster, Germany. For this purpose, the deammonifying sludge available at the side stream was investigated for nitrogen (N) removal with respect to the operational factors temperature (15–30 °C), pH value (6.0–8.0) and chemical oxygen demand (COD)/N ratio (≤1.5–6.0). The highest and lowest N-removal rates of 0.13 and 0.045 kg/(m³ d) were achieved at 30 and 15 °C, respectively. Different conditions of pH and COD/N ratio in the SBRs of partial nitritation/anammox (PN/A) significantly influenced both the metabolic processes and the associated N-removal rates. The scientific insights gained from the current work signify the possibility of mainstream PN/A at WWTPs. The current study forms a solid operational basis for the upcoming semi-technical trials to be conducted prior to full-scale mainstream PN/A at the WWTP Kaster and at WWTPs globally.
The "IceMole" is a novel maneuverable subsurface ice probe for clean in-situ analysis and sampling of subsurface ice and subglacial water/brine. It is developed and built at the FH Aachen University of Applied Sciences' Astronautical Laboratory. A first prototype was successfully tested on the Swiss Morteratsch glacier in 2010. Clean sampling is achieved with a hollow ice screw (as used in mountaineering) at the tip of the probe. Maneuverability is achieved with a differentially heated melting head. Funded by the German Space Agency (DLR), a consortium led by FH Aachen is currently developing a much more advanced IceMole probe, which includes a sophisticated system for obstacle avoidance, target detection, and navigation in the ice. We intend to use this probe for taking clean samples of subglacial brine at the Blood Falls (McMurdo Dry Valleys, East Antarctica) for chemical and microbiological analysis. In our conference contribution, we 1) describe the IceMole design, 2) report the results of the field tests of the first prototype on the Morteratsch glacier, 3) discuss the probe's potential for the clean in-situ analysis and sampling of subsurface ice and subglacial liquids, and 4) outline the way ahead in the development of this technology.
In this chapter, the key technologies and the instrumentation required for the subsurface exploration of ocean worlds are discussed. The focus is laid on Jupiter’s moon Europa and Saturn’s moon Enceladus because they have the highest potential for such missions in the near future. The exploration of their oceans requires landing on the surface, penetrating the thick ice shell with an ice-penetrating probe, and probably diving with an underwater vehicle through dozens of kilometers of water to the ocean floor, to have the chance to find life, if it exists. Technologically, such missions are extremely challenging. The required key technologies include power generation, communications, pressure resistance, radiation hardness, corrosion protection, navigation, miniaturization, autonomy, and sterilization and cleaning. Simpler mission concepts involve impactors and penetrators or – in the case of Enceladus – plume-fly-through missions.
We present the novel concept of a combined drilling and melting probe for subsurface ice research. This probe, named “IceMole”, is currently developed, built, and tested at the FH Aachen University of Applied Sciences’ Astronautical Laboratory. Here, we describe its first prototype design and report the results of its field tests on the Swiss Morteratsch glacier. Although the IceMole design is currently adapted to terrestrial glaciers and ice shields, it may later be modified for the subsurface in-situ investigation of extraterrestrial ice, e.g., on Mars, Europa, and Enceladus. If life exists on those bodies, it may be present in the ice (as life can also be found in the deep ice of Earth).
Evaluation of fragility curves for a three-storey-reinforced-concrete mock-up of SMART 2013 project
(2016)
The paper deals with the development of a probabilistic approach to the assessment of risk due to lightning. Sources of damage, types of damage and types of loss are defined, and, accordingly, a procedure for risk analysis and a way of assessing the different risk components are proposed. The way to evaluate the influence of different protection measures (lightning protection system; shielding of structure, cables and equipment; routing of internal wiring; surge protective devices) in reducing such probabilities is considered. The paper has been prepared within the framework of the activity of IEC TC81-WG9/CLC TC81-WG4 directed at preparing the draft IEC 62305-2 Risk Management, in cooperation with the Secretary of IEC/CLC TC81.
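The risk composition described above can be illustrated with a small numerical sketch. In IEC 62305-2-style analysis, each risk component is the product of the annual number of dangerous events N, the probability P that an event causes damage, and the relative loss L; the total risk is the sum over components and is compared against a tolerable risk. All numbers below are hypothetical placeholders, not values from the standard or the paper.

```python
# Probabilistic lightning-risk composition: R_X = N_X * P_X * L_X for
# each component X, summed to a total risk R and compared with a
# tolerable risk R_T. Component labels and values are illustrative.

def risk_component(N, P, L):
    """Annual events N * damage probability P * relative loss L."""
    return N * P * L

components = [
    # (description,                   N [1/yr], P,    L)
    ("flash to structure, injury",     0.02,    0.1,  1e-2),
    ("flash near structure, surge",    0.50,    0.05, 1e-3),
    ("flash to service line, fire",    0.10,    0.2,  5e-3),
]
R = sum(risk_component(N, P, L) for _, N, P, L in components)

R_T = 1e-5  # assumed tolerable annual risk
protection_needed = R > R_T  # True here: protection measures required
```

Protection measures such as an LPS or surge protective devices act by reducing the P factors of individual components until R falls below R_T.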
In this work, the effects of carbon sources and culture media on the production and structural properties of bacterial cellulose (BC) synthesized by Medusomyces gisevii have been studied. The culture medium was composed of different initial concentrations of glucose or sucrose dissolved in 0.4% extract of plain green tea. Parameters of the culture media (titratable acidity, substrate conversion degree, etc.) were monitored daily for 20 days of cultivation. The BC pellicles produced on different carbon sources were characterized in terms of biomass yield, crystallinity and morphology by field emission scanning electron microscopy (FE-SEM), atomic force microscopy and X-ray diffraction. Our results showed that Medusomyces gisevii had higher BC yields in media with sugar concentrations close to 10 g L−1 after an 18–20-day incubation period. Glucose generally led to a higher BC yield (173 g L−1) compared to sucrose (163.5 g L−1). The BC crystallinity degree and surface roughness were higher in the samples synthesized from sucrose. The FE-SEM micrographs show that the BC pellicles synthesized in the sucrose media contained densely packed tangles of cellulose fibrils, whereas the BC produced in the glucose media displayed a rather linear geometry of the BC fibrils without noticeable aggregates.
A melting probe equipped with an autofluorescence-based detection system combined with a light scattering unit and, optionally, a microarray chip would be ideally suited to probe icy environments such as Europa's ice layer, as well as the polar ice layers of Earth and Mars, for recent and extinct life.
Mechanical stimulation of the cells resulted in evident changes in cell morphology, protein composition and gene expression. Microscopically, additional formation of stress fibers accompanied by cell rearrangements in the monolayer was observed. Also, significant activation of the p53 gene was revealed compared to the control. Interestingly, the use of the CellTech membrane coating induced cell death after mechanical stress had been applied. Such an effect was not detected when fibronectin had been used as the adhesion substrate.
Recently, the SHARP Corporation has developed the world's first "Plasma Cluster Ions (PCI)" air purification technology, which uses plasma discharge to generate cluster ions. The new plasma cluster device releases positive and negative ions into the air, which are harmless to humans and are able to decompose and deactivate airborne substances by chemical reactions. Numerous phenomenological tests of the PCI air purification technology on microbial cells have been conducted, and in most cases PCI has been shown to have a strongly pronounced killing effect. However, the particular mechanisms of PCI action are still not evident. We studied variations in resistance to PCI among gram-positive airborne microorganisms, as well as some dose-dependent, spatial, cultural and biochemical properties of PCI action with respect to Staphylococcus spp., Enterococcus spp. and Micrococcus spp.
Summary and conclusions: PCIs were clearly effective in terms of their antibacterial effects on the strains tested. This efficacy increased with the time the bacteria were exposed to PCIs. The bactericidal action proved to be irreversible. PCIs were significantly less effective in shadowed areas. PCI exposure caused damage to multiple proteins, as observed in SDS-PAGE studies. Bacterial death was caused not by a single molecular mechanism but by multiple ones.
Recently, the SHARP Corporation, Japan, has developed the world's first "Plasma Cluster Ions (PCI)" air purification technology, using plasma discharge to generate cluster ions. The new plasma cluster device releases positive and negative ions into the air, which are able to decompose and deactivate harmful airborne substances by chemical reactions. Because cluster ions consist of positive and negative ions that normally exist in the natural world, they are completely harmless and safe to humans. The amount of ozone generated by cluster ions is less than 0.01 ppm, which is significantly less than the 0.05 ppm standard for industrial operations and consumer electronics; this amount thus has no harmful effects whatsoever on the human body. However, the particular properties and chemical processes of PCI treatment are still under study. It has been shown that PCI in most cases has strongly pronounced, irreversible killing effects on airborne microflora due to free-radical-induced reactions, and it can be considered a potent technology for disinfecting home, medical and industrial appliances.
Recently, SHARP corporation has developed the world’s first "Plasma Cluster Ions® (PCI)" air purification technology, which uses plasma discharge to generate cluster ions. The new Plasma Cluster Device releases positive and negative ions into the air, which are harmless to humans and are able to decompose and deactivate airborne substances by chemical reactions. In the past, phenomenological tests on the efficacy of the PCI air purification technology on microbial cells have been conducted. In most cases, it has been shown that PCI demonstrated strongly pronounced killing effects on microorganisms. However, the particular mechanisms of PCI action still have to be uncovered.
The “1. Stokes’ problem”, the “suddenly accelerated flat wall”, is the oldest application of the Navier-Stokes equations. Stokes’ solution of the “problem” does not comply with the mathematical theorem of Cauchy and Kowalewskaya on the “Uniqueness and Existence” of solutions of partial differential equations and violates the physical theorem of minimum entropy production/dissipation of the Thermodynamics of Irreversible Processes. The result includes very high local shear stresses and dissipation rates. That is of special interest for the theory of turbulent and mixed turbulent/laminar flow. A textbook solution of the “1. Stokes Problem” is the Couette flow, which has a constant sheer stress along a linear profile. A consequence is that the Navier-Stokes equations do not describe any S-shaped part of a turbulent profile found in any turbulent Couette experiment. The paper surveys arguments referring to that statement, concerning the history of >150 years. Contrary to this there is always a Navier-Stokes solution near the wall, observed by a linear part of the Couette profile. There a turbulent description (e.g. by the logarithmic law-of-the-wall) fails completely. That is explained by the minimum dissipation requirement together with the Couette feature τ = const. The local co-existence of a turbulent zone and a laminar zone near the wall is stable and observed also at high Reynolds-Numbers.
Analyzing thermodynamic non-equilibrium processes such as laminar and turbulent fluid flow, the dissipation is a key parameter with a characteristic minimum condition. This is applied to characterize the laminar and turbulent behaviour of the Couette flow, including its transition in both directions. The Couette flow is chosen as the only flow form with constant shear stress over the flow profile, whether laminar, turbulent or both. The local dissipation defines quantitative and stable criteria for the transition and the existence of turbulence. The basic results are: the Navier-Stokes equations cannot describe the experimental flow profiles of the turbulent Couette flow, but they are used to quantify the dissipation of turbulent fluctuation. The dissipation minimum requires turbulent structures reaching maximum macroscopic dimensions, describing turbulence as a "non-local" phenomenon. At the transition, the Couette flow profiles and the shear stress change by a factor of ≅ 5 due to a change of the "apparent" turbulent viscosity by a calculated factor of ≅ 27. The resulting difference between the laminar and the turbulent profiles leads to two different Reynolds numbers and different loci of transition, which are identified by calculation.
The minimum dissipation requirement of the thermodynamics of irreversible processes is applied to characterize the existence of laminar and non-laminar flow, and the co-existence of laminar and turbulent flow zones. Local limitations of the different zones and three different forms of transition are defined. For the Couette flow, a non-local "corpuscular" flow mechanism explains the logarithmic law of the wall, maximum turbulent dimensions and a value of κ = 0.415 for the von Kármán constant. Limitations of the logarithmic law near the wall and in the centre of the experiment are interpreted.
The number of electric vehicles increases steadily, while the space for extending the charging infrastructure is limited. Particularly in urban areas, where parking spaces in attractive locations are scarce, opportunities to set up new charging stations are very limited. This leads to an overload of some very attractive charging stations and an underutilization of less attractive ones. Against this background, the paper at hand presents the design of an e-vehicle reservation system that aims at distributing the utilization of the charging infrastructure, particularly in urban areas. By applying a design science approach, the requirements for a reservation-based utilization approach are elicited, and a model for a suitable distribution approach and its instantiation are developed. The artefact is evaluated by simulating the distribution effects based on data of real charging station utilization.
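One simple way such a reservation-based distribution could work is a greedy rule: assign each incoming charging request to the least-utilized station within the driver's acceptable distance. This sketch only illustrates that general idea; the station names, positions, and the greedy assignment rule are assumptions, not the paper's actual model.

```python
# Greedy sketch of reservation-based load distribution: each request is
# assigned to the station with the lowest reservation ratio among those
# within the driver's acceptable distance. Positions are 1-D for brevity.

def assign(request_pos, stations, max_dist):
    """Reserve a slot at the least-utilized reachable station."""
    candidates = [s for s in stations if abs(s["pos"] - request_pos) <= max_dist]
    if not candidates:
        return None  # no reachable station with known utilization
    best = min(candidates, key=lambda s: s["reserved"] / s["capacity"])
    best["reserved"] += 1
    return best["name"]

stations = [
    {"name": "A (city centre)", "pos": 0, "capacity": 4, "reserved": 4},  # popular, fully booked
    {"name": "B (side street)", "pos": 2, "capacity": 4, "reserved": 0},  # underused
]
# A driver at position 1 would normally head for attractive station A;
# the reservation system redirects the request to underused station B.
choice = assign(1, stations, max_dist=3)
```

Evaluating such a rule against recorded utilization data, as the paper does via simulation, shows whether the redirection actually flattens the load across stations.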
Wind energy represents the dominant share of renewable energies. The rotor blades of a wind turbine are typically made from composite material, which withstands high forces during rotation. The huge dimensions of the rotor blades complicate the inspection processes in manufacturing. The automation of inspection processes has great potential to increase overall productivity and to create a consistent, reliable database for each individual rotor blade. The focus of this paper is set on automating the rotor blade inspection process by utilizing an autonomous mobile manipulator. The main innovations include a novel path planning strategy for zone-based navigation, which enables an intuitive right-hand or left-hand driving behavior in a shared human–robot workspace. In addition, we introduce a new method for surface-orthogonal motion planning in connection with large-scale structures. An overall execution strategy controls the navigation and manipulation processes of the long-running inspection task. The implemented concepts are evaluated in simulation and applied in a real use case including the tip of a rotor blade form.
Close interrelations between sound and image are not a mere phenomenon of today’s multimedia technology. The idea of the synthesis of different media lies at the core of the concept of the Gesamtkunstwerk in the second half of the 19th century and it can also be traced back to the synaesthesia debate at the beginning of the 20th century [...].
Subtilisins from microbial sources, especially from the Bacillaceae family, are of particular interest for biotechnological applications and serve the currently growing enzyme market as efficient and novel biocatalysts. Biotechnological applications include use in detergents, cosmetics, leather processing, wastewater treatment and pharmaceuticals. To identify a possible candidate for the enzyme market, we cloned the gene of the subtilisin SPFA from Fictibacillus arsenicus DSM 15822ᵀ (obtained through a data-mining-based search) and expressed it in Bacillus subtilis DB104. After production and purification, the protease showed a molecular mass of 27.57 kDa and a pI of 5.8. SPFA displayed hydrolytic activity at a temperature optimum of 80 °C and a very broad pH optimum between 8.5 and 11.5, with high activity up to pH 12.5. SPFA displayed no NaCl dependence but a high NaCl tolerance, with activity decreasing at concentrations up to 5 M NaCl. Its stability increased with increasing NaCl concentration. Based on its substrate preference among 10 synthetic peptide 4-nitroanilide substrates with three or four amino acids and its phylogenetic classification, SPFA can be assigned to the subgroup of true subtilisins. Moreover, SPFA exhibited high tolerance to 5% (w/v) SDS and 5% (v/v) H₂O₂. The biochemical properties of SPFA, especially its tolerance of remarkably high pH, SDS and H₂O₂, suggest it has potential for biotechnological applications.
Halophilic and halotolerant microorganisms represent a promising source of salt-tolerant enzymes suitable for various biotechnological applications where high salt concentrations would otherwise limit enzymatic activity. Considering the currently growing enzyme market and the need for more efficient and new biocatalysts, the present study aimed at the characterization of a high-alkaline subtilisin from Alkalihalobacillus okhensis Kh10-101ᵀ. The protease gene was cloned and expressed in Bacillus subtilis DB104. The recombinant protease SPAO, with 269 amino acids, belongs to the subfamily of high-alkaline subtilisins. The biochemical characteristics of purified SPAO were analyzed in comparison with subtilisin Carlsberg, Savinase and BPN'. SPAO, a monomer with a molecular mass of 27.1 kDa, was active over a wide range of pH 6.0–12.0 and temperature 20–80 °C, optimally at pH 9.0–9.5 and 55 °C. The protease is highly oxidatively stable to hydrogen peroxide: it retained 58% residual activity when incubated at 10 °C with 5% (v/v) H2O2 for 1 h, and was even stimulated by 1% (v/v) H2O2. Furthermore, SPAO was very stable and active at NaCl concentrations up to 5.0 M. This study demonstrates the potential of SPAO for future biotechnological applications.
The aim of the present study was the characterisation of three true subtilisins and one phylogenetically intermediate subtilisin from halotolerant and halophilic microorganisms. Considering the currently growing enzyme market for efficient and novel biocatalysts, data mining is a promising source of novel, as yet uncharacterised enzymes, especially from halophilic or halotolerant Bacillaceae, which offer great potential to meet industrial needs. The halophilic bacteria Pontibacillus marinus DSM 16465ᵀ and Alkalibacillus haloalkaliphilus DSM 5271ᵀ and the halotolerant bacteria Metabacillus indicus DSM 16189 and Litchfieldia alkalitelluris DSM 16976ᵀ served as sources for the four new subtilisins SPPM, SPAH, SPMI and SPLA. The protease genes were cloned and expressed in Bacillus subtilis DB104. Purification to apparent homogeneity was achieved by ethanol precipitation, desalting and ion-exchange chromatography. Enzyme activity could be observed between pH 5.0 and 12.0, with an optimum for SPPM, SPMI and SPLA around pH 9.0 and for SPAH at pH 10.0. The optimal temperature for SPMI and SPLA was 70 °C, and for SPPM and SPAH 55 °C and 50 °C, respectively. All proteases showed high stability towards 5% (w/v) SDS and were active even at NaCl concentrations of 5 M. The four proteases demonstrate potential for future biotechnological applications.
The main objective of our ROS Summer School series is to introduce MA-level students to programming mobile robots with the Robot Operating System (ROS). ROS is a robot middleware that is used by many research institutions worldwide. Therefore, many state-of-the-art algorithms of mobile robotics are available in ROS and can be deployed very easily. As a basic robot platform we deploy a 1/10-scale RC car that is equipped with an Arduino microcontroller to control the servo motors, and an embedded PC that runs ROS. In two weeks, participants learn the basics of mobile robotics hands-on. We describe our teaching concepts and our curriculum, and report on the learning success of our students.
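A typical exercise on such a platform is the glue between the ROS side and the Arduino-driven servos: mapping a normalized drive command (speed and steering in [-1, 1], as carried by a velocity message) onto the PWM pulse widths the servos expect. The 1000-2000 µs pulse range with a 1500 µs neutral point is a common RC-servo convention; the exact mapping used in the course is an assumption here.

```python
# Sketch: convert normalized drive commands into RC-servo pulse widths.
# Throttle and steering both use the conventional 1000-2000 us range,
# with 1500 us as neutral (stop / straight ahead).

NEUTRAL_US = 1500  # neutral pulse width in microseconds
SPAN_US = 500      # deflection from neutral at full command

def command_to_pulses(speed, steering):
    """Clamp commands to [-1, 1] and map them to servo pulse widths."""
    clamp = lambda x: max(-1.0, min(1.0, x))
    throttle_us = NEUTRAL_US + clamp(speed) * SPAN_US
    steering_us = NEUTRAL_US + clamp(steering) * SPAN_US
    return round(throttle_us), round(steering_us)

# Full speed ahead with a slight left turn:
pulses = command_to_pulses(1.0, -0.2)  # (2000, 1400)
```

In the actual setup, a ROS node would subscribe to velocity commands, run this conversion, and send the pulse widths to the Arduino over a serial link.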
For short take-off and landing (STOL) aircraft, a parallel hybrid-electric propulsion system potentially offers superior performance compared to a conventional propulsion system, because the short-take-off power requirement is much higher than the cruise power requirement. This power-matching problem can be solved with a balanced hybrid propulsion system. However, there is a trade-off between wing loading, power loading, the level of hybridization, as well as range and take-off distance. An optimization method can vary design variables in such a way that a minimum of a particular objective is attained. In this paper, a comparison between the optimization results for minimum mass, minimum consumed primary energy, and minimum cost is conducted. A new initial sizing algorithm for general aviation aircraft with hybrid-electric propulsion systems is applied. This initial sizing methodology covers point performance, mission performance analysis, the weight estimation process, and cost estimation. The methodology is applied to the design of a STOL general aviation aircraft, intended for on-demand air mobility operations. The aircraft is sized to carry eight passengers over a distance of 500 km, while able to take off and land from short airstrips. Results indicate that parallel hybrid-electric propulsion systems must be considered for future STOL aircraft.
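The power-matching argument above can be made concrete with a small sketch: size the combustion engine near the cruise requirement (with some margin) and let the electric motor supply the short-take-off surplus; the degree of hybridization of power then follows from the two requirements. The sizing rule, the margin, and all numbers below are illustrative assumptions, not the paper's actual algorithm or results.

```python
# Sketch of power matching for a parallel hybrid-electric STOL aircraft:
# the engine covers cruise power (plus margin), the electric motor covers
# the remaining short-take-off demand, and H_P = P_motor / P_takeoff is
# the resulting degree of hybridization of power.

def split_powers(p_takeoff_kw, p_cruise_kw, margin=1.1):
    """Return (engine power, motor power, hybridization) in kW, kW, -."""
    p_engine = p_cruise_kw * margin                 # sized for cruise
    p_motor = max(0.0, p_takeoff_kw - p_engine)     # take-off boost
    h_p = p_motor / p_takeoff_kw
    return p_engine, p_motor, h_p

# Hypothetical 8-seat STOL design point: 400 kW needed for short
# take-off, but only 180 kW in cruise.
engine, motor, h_p = split_powers(400.0, 180.0)
# engine = 198 kW, motor = 202 kW, H_P = 0.505
```

Because the motor only runs for the short take-off and climb segments, its battery can stay small, which is what makes the balanced hybrid attractive for this mission profile.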
On 1 January 1998, the German telecom market was fully liberalised. Since then, genuine competition between market participants has developed, based on a comprehensive legal and regulatory framework that provides safeguards against unfair competition and against market power by Deutsche Telekom. Today, about 10 years after the liberalisation of the telecommunications sector, a revision of this regulatory approach has become necessary, because in at least three dimensions the situation is quite different from the one 10 years ago. First, with numerous established alternative operators in the market, monopolies have been successfully challenged and competition introduced. Second, not only is cable TV becoming a viable alternative for the provision of broadband services in large parts of Germany, but mobile services are also increasingly becoming a substitute for fixed services. Last but not least, important technological changes are under way, requiring huge investments in infrastructure upgrades for next-generation networks. In the light of these new developments, the question is to what extent the current regulatory approach of severe ex-ante regulatory intervention is still appropriate. Is any part of the former incumbent's network still a bottleneck? A more light-handed regulatory approach might be the right response to this new situation. The paper is organised as follows: the first section briefly examines the economic rationale for regulating network access. Based on the assumption that regulation is always necessary when bottlenecks exist, regulatory principles for an efficient network access regime are derived. The second section compares the situation of the German market in early 1998 with today's, considering three dimensions: the degree of competition, the potential for substitution and technological developments. The third section defines some requirements for the future regulation of telecom markets.
Finally, proposals are elaborated on how to ensure competitive telecom markets in the light of new economic and technological challenges.
Working paper distributed at the 2nd Annual Next Generation Telecommunications Conference 2009, 13–14 October 2009, Brussels. 14 pages.
Abstract: Governments all over Europe are in the process of adopting new broadband strategies. The objective is to create modern telecommunications networks based on powerful broadband infrastructures. In doing so, they aim for innovative and investment-friendly concepts. For instance, in a recently published consultation paper on the subject, the German regulator BNetzA declared that it will take "greater account of … reducing risks, securing the investment and innovation power, providing planning certainty and transparency – in order to support and advance broadband rollout in Germany". It further states that when regulating wholesale rates it has to be ensured that "… adequate incentives for network rollout are provided on the one hand, while sustainable and fair competition is ensured on the other". An EC draft recommendation on regulated network access is also about to set new standards for the regulation of next-generation access networks. According to the recommendation, the prices of new assets shall be based on costs plus a project-specific risk premium, included in the cost of capital, for the investment risk incurred by the operator. This approach has been criticised from various sides. In particular, it has been questioned whether such an approach is adequate to meet the objectives of encouraging both competition and investment in next-generation access networks. Against this background, the concept of "long-term risk-sharing contracts" has recently been proposed as an approach which not only incorporates the various additional risks involved in the deployment of NGA infrastructure, but also has several other advantages. This paper will demonstrate that the concept allows competition to evolve at both the retail and the wholesale level on fair, objective, non-discriminatory and transparent terms and conditions.
Moreover, it ensures the highest possible investment incentive in line with socially desirable outcomes. The paper is organised as follows: the next section briefly outlines the importance of encouraging competition and investment in an NGA environment. The third section specifies the design of long-term risk-sharing contracts in view of achieving these objectives. The fourth section examines potential problems associated with the concept and elaborates a way of dealing with them. The last section looks at arguments against long-term risk-sharing contracts and shows that these arguments are not strong enough to build a case against introducing such contracts.