This study reviews the practice of brake tests in freight railways, which is time-consuming and not suitable for detecting certain failure types. Public incident reports are analysed to derive a reasonable brake test hardware and communication architecture that aims to provide automatic brake tests at lower cost than current solutions. The proposed solution relies exclusively on brake pipe and brake cylinder pressure sensors, a brake release position switch, and radio communication via standard protocols. The approach is embedded in the Wagon 4.0 concept, a holistic approach to a smart freight wagon. The reduction of manual processes provides a strong incentive due to high savings in manual labour and increased productivity.
This study focuses on thermoelectric elements (TEE) as an alternative for room temperature control. TEE are semiconductor devices that can provide heating and cooling via a heat pump effect without direct noise emissions or refrigerant use. An efficiency evaluation of the optimal operating mode is carried out for different numbers of TEE, ambient temperatures, and heating loads. The influence of an additional heat recovery unit on system efficiency and of an unevenly distributed heating demand is examined. The results show that TEE can provide heat at a coefficient of performance (COP) greater than one, especially for small heating demands and high ambient temperatures. The efficiency increases with the number of elements in the system and is subject to economies of scale. The best COP exceeds six at optimal operating conditions. An additional heat recovery unit proves beneficial for low ambient temperatures and systems with few TEE. It makes COPs above one possible at ambient temperatures below 0 °C. The effect increases efficiency by at most 0.81 (from 1.90 to 2.71) at an ambient temperature 5 K below room temperature and a heating demand of Q̇h = 100 W, but is subject to diseconomies of scale. Thermoelectric technology is a valuable option for electricity-based heat supply and can provide cooling and ventilation functions. A careful system design as well as an additional heat recovery unit significantly benefits performance. This makes TEE superior to direct-current heating systems and competitive with heat pumps for small-scale applications with a focus on avoiding noise and harmful refrigerants.
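The headline COP figures above translate directly into electrical input power. A minimal sketch of that arithmetic, assuming the standard COP definition (the helper function is hypothetical; the numbers are those quoted in the abstract):

```python
# Illustrative sketch: electrical power needed to deliver a given heating
# demand at the COPs reported in the abstract (with/without heat recovery).

def electrical_power(q_heat_w: float, cop: float) -> float:
    """Electrical input power to deliver q_heat_w at a given COP."""
    return q_heat_w / cop

Q_H = 100.0        # heating demand in W (from the abstract)
COP_BASE = 1.90    # without heat recovery (from the abstract)
COP_HR = 2.71      # with heat recovery (from the abstract)

p_base = electrical_power(Q_H, COP_BASE)
p_hr = electrical_power(Q_H, COP_HR)
saving = p_base - p_hr
print(f"{p_base:.1f} W -> {p_hr:.1f} W, saving {saving:.1f} W")
```

For this operating point, the heat recovery unit cuts the electrical demand from roughly 53 W to 37 W, which is why the abstract highlights it for low ambient temperatures.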
Wind energy represents the dominant share of renewable energies. The rotor blades of a wind turbine are typically made from composite material, which withstands high forces during rotation. The huge dimensions of the rotor blades complicate the inspection processes in manufacturing. The automation of inspection processes has great potential to increase the overall productivity and to create a consistent, reliable database for each individual rotor blade. The focus of this paper is on the automation of rotor blade inspection using an autonomous mobile manipulator. The main innovations include a novel path planning strategy for zone-based navigation, which enables an intuitive right-hand or left-hand driving behavior in a shared human–robot workspace. In addition, we introduce a new method for surface-orthogonal motion planning in connection with large-scale structures. An overall execution strategy controls the navigation and manipulation processes of the long-running inspection task. The implemented concepts are evaluated in simulation and applied in a real use case involving the tip of a rotor blade form.
Microbial diversity studies of aquatic communities that experienced or are experiencing environmental problems are essential for the comprehension of remediation dynamics. In this pilot study, we present data on the phylogenetic and ecological structure of microorganisms from epipelagic water samples collected in the Small Aral Sea (SAS). The raw data were generated by massive parallel sequencing using the shotgun approach. As expected, most of the identified DNA sequences belonged to Terrabacteria and Actinobacteria (40% and 37% of the total reads, respectively). The occurrence of Deinococcus-Thermus, Armatimonadetes, and Chloroflexi in the epipelagic SAS waters was less anticipated. Also surprising was the detection of sequences characteristic of strict anaerobes (Ignavibacteria), hydrogen-oxidizing bacteria, and archaeal methanogenic species. We suppose that the observed very broad range of phylogenetic and ecological features displayed by the SAS reads demonstrates a more intensive mixing of water masses originating from diverse ecological niches of the Aral-Syr Darya River basin than presumed before.
We consider a binary multivariate regression model where the conditional expectation of a binary variable given a higher-dimensional input variable belongs to a parametric family. Based on this, we introduce a model-based bootstrap (MBB) test for higher-dimensional input variables, which can be used to check whether a sequence of independent and identically distributed observations belongs to such a parametric family. The approach is based on the empirical residual process introduced by Stute (Ann Statist 25:613–641, 1997). In contrast to the approach of Stute & Zhu (Scandinavian J Statist 29:535–545, 2002), a transformation is not required. Thus, any problems associated with non-parametric regression estimation are avoided. As a result, the MBB method is much easier for users to implement. To illustrate the power of the MBB-based tests, a small simulation study is performed. Compared to the approach of Stute & Zhu, the simulations indicate a slightly improved power of the MBB-based method. Finally, both methods are applied to a real data set.
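The general model-based bootstrap idea can be sketched in a few lines. This is not the authors' implementation: the logistic model, the crude gradient fit, and the CUSUM-type statistic on the ordered residuals are all assumptions chosen to illustrate the resampling scheme (resample responses from the fitted model, refit, recompute the statistic):

```python
# Minimal sketch of a model-based (parametric) bootstrap goodness-of-fit
# test for a binary regression model. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def fit_logistic(x, y, iters=200, lr=0.1):
    """Crude gradient-ascent fit of P(Y=1|x) = sigmoid(a + b*x)."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(a + b * x)))
        a += lr * np.mean(y - p)
        b += lr * np.mean((y - p) * x)
    return a, b

def cusum_stat(x, y, a, b):
    """Sup of the cumulative residual process, with observations ordered by x."""
    order = np.argsort(x)
    p = 1.0 / (1.0 + np.exp(-(a + b * x[order])))
    resid = y[order] - p
    return np.max(np.abs(np.cumsum(resid))) / np.sqrt(len(x))

# Simulated data that actually follow the hypothesized logistic model.
n = 300
x = rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x)))
y = (rng.random(n) < p_true).astype(float)

a_hat, b_hat = fit_logistic(x, y)
t_obs = cusum_stat(x, y, a_hat, b_hat)

# Model-based bootstrap: resample Y from the *fitted* model, refit, recompute.
B = 200
t_boot = np.empty(B)
p_hat = 1.0 / (1.0 + np.exp(-(a_hat + b_hat * x)))
for i in range(B):
    y_star = (rng.random(n) < p_hat).astype(float)
    a_s, b_s = fit_logistic(x, y_star)
    t_boot[i] = cusum_stat(x, y_star, a_s, b_s)

p_value = np.mean(t_boot >= t_obs)
print(f"test statistic {t_obs:.3f}, bootstrap p-value {p_value:.2f}")
```

The appeal noted in the abstract is visible here: no transformation or non-parametric smoothing is needed, only refitting the parametric model on each bootstrap sample.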
Extension fractures are typical for deformation under low or no confining pressure. They can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock was developed. In this article, it is shown that the simple extension strain criterion makes unrealistic strength predictions in biaxial compression and tension. To overcome this major limitation, a new extension strain criterion is proposed by adding a weighted principal shear component to the simple criterion. The shear weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting extension failure modes, which are unexpected in the classical understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain, leading to dilatancy. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD) and as a failure surface at peak stress (CP). Different from compressive loading, tensile loading requires only a limited number of critical cracks to cause failure. Therefore, for tensile stresses, the failure criterion must be modified, for example by a cut-off corresponding to the CI stress. Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass compared to the simple extension strain criterion.
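The structure of the criteria described above can be sketched schematically. The sign convention (extension negative, compression positive), the symbols, and the exact placement of the shear weight are assumptions for illustration here, not taken from the article, where the weight is calibrated against Mohr–Coulomb:

```latex
% Simple extension strain criterion: fracture initiates when the
% smallest principal strain reaches a critical extension strain
\varepsilon_3 \le -\varepsilon_{c}
% Enriched criterion (schematic): a principal shear term with weight w
% is added; w is calibrated so the surface coincides with Mohr--Coulomb
\varepsilon_3 - w \, \frac{\varepsilon_1 - \varepsilon_3}{2} \le -\varepsilon_{c}
```

The sign of the shear contribution depends on the adopted convention; the point is only that a single weighted shear term suffices to map the extension strain surface onto the MC surface.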
Previous studies optimized the dimensions of coaxial heat exchangers using constant mass flow rates as a boundary condition. They show a thermally optimal circular ring width of nearly zero. Hydraulically optimal is an inner-to-outer pipe radius ratio of 0.65 for turbulent and 0.68 for laminar flow types. In contrast, in this study, flow conditions in the circular ring are kept constant (a set of fixed Reynolds numbers) during optimization. This approach ensures fixed flow conditions and prevents inappropriately high or low mass flow rates. The optimization is carried out for three objectives: maximum energy gain, minimum hydraulic effort, and eventually optimum net-exergy balance. The optimization changes the inner pipe radius and mass flow rate but not the Reynolds number of the circular ring. The thermal calculations are based on Hellström's borehole resistance, and the hydraulic optimization on individually calculated linear loss-of-head coefficients. Increasing the inner pipe radius results in decreased hydraulic losses in the inner pipe but increased losses in the circular ring. The net-exergy difference is a key performance indicator and combines thermal and hydraulic calculations. It is the difference between thermal exergy flux and hydraulic effort. During all optimizations, the Reynolds number in the circular ring, rather than the mass flow rate, is held constant. The result from a thermal perspective is an optimal width of the circular ring of nearly zero. The hydraulically optimal inner pipe radius is 54% of the outer pipe radius for laminar flow and 60% for turbulent flow scenarios. Net-exergetic optimization shows a predominant influence of hydraulic losses, especially for small temperature gains. The exact result depends on the earth's thermal properties and the flow type. Conclusively, the design of coaxial geothermal probes should focus on the hydraulic optimum and take the thermal optimum as a secondary criterion due to the dominating hydraulics.
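The net-exergy difference defined above (thermal exergy flux minus hydraulic effort) can be sketched with a simple model. This is not the study's formulation: the Carnot-factor approximation of the thermal exergy flux and all numbers below are assumptions for illustration:

```python
# Illustrative sketch: net-exergy difference = thermal exergy flux
# minus hydraulic pumping power. Carnot-factor approximation assumed.

def thermal_exergy_flux(q_thermal_w, t_fluid_k, t_ref_k):
    """Exergy content of the heat gain (Carnot-factor approximation)."""
    return q_thermal_w * (1.0 - t_ref_k / t_fluid_k)

def hydraulic_effort(delta_p_pa, vol_flow_m3s):
    """Pumping power required to overcome the pressure loss."""
    return delta_p_pa * vol_flow_m3s

def net_exergy(q_thermal_w, t_fluid_k, t_ref_k, delta_p_pa, vol_flow_m3s):
    return (thermal_exergy_flux(q_thermal_w, t_fluid_k, t_ref_k)
            - hydraulic_effort(delta_p_pa, vol_flow_m3s))

# Hypothetical numbers: 5 kW heat gain at 288 K fluid temperature against
# a 283 K reference, 0.2 bar pressure loss at 0.5 L/s volume flow.
net = net_exergy(5000.0, 288.0, 283.0, 20000.0, 0.0005)
print(f"net-exergy difference: {net:.1f} W")
```

Because the Carnot factor is small for small temperature gains, even modest pumping power eats a large share of the thermal exergy, which is the mechanism behind the study's conclusion that hydraulics dominate.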
The paper presents the derivation of a new equivalent skin friction coefficient for estimating the parasitic drag of short-to-medium-range fixed-wing unmanned aircraft. The new coefficient is derived from an aerodynamic analysis of ten different unmanned aircraft used for surveillance, reconnaissance, and search and rescue missions. The aircraft are simulated using a validated unsteady Reynolds-averaged Navier–Stokes approach. The UAVs' parasitic drag is significantly influenced by the presence of miscellaneous components like fixed landing gears or electro-optical sensor turrets. These components are responsible for almost half of an unmanned aircraft's total parasitic drag. The new equivalent skin friction coefficient accounts for these effects and is significantly higher compared to that of other aircraft categories. It is used to initially size an unmanned aircraft for a typical reconnaissance mission. The improved parasitic drag estimation yields a much heavier unmanned aircraft when compared to sizing results based on available drag data of manned aircraft.
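The equivalent-skin-friction method referred to above follows the standard textbook drag build-up; the relation below is the general formula, while the coefficient and area ratio used here are hypothetical placeholders, not the values derived in the paper:

```python
# Standard equivalent-skin-friction drag build-up:
# zero-lift drag coefficient C_D0 = C_fe * S_wet / S_ref.

def parasitic_drag_coeff(c_fe: float, s_wet: float, s_ref: float) -> float:
    """Zero-lift (parasitic) drag coefficient from an equivalent skin
    friction coefficient c_fe and the wetted-to-reference area ratio."""
    return c_fe * s_wet / s_ref

# Hypothetical values: per the abstract, c_fe for UAVs with fixed gear and
# sensor turrets is significantly higher than for manned aircraft.
c_d0 = parasitic_drag_coeff(c_fe=0.0055, s_wet=4.0, s_ref=1.0)
print(f"C_D0 = {c_d0:.4f}")
```

In sizing, a higher C_fe feeds directly into a higher drag estimate and hence, as the abstract notes, a heavier aircraft for the same mission.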
Reliable automation of the labor-intensive manual task of scoring animal sleep can facilitate the analysis of long-term sleep studies. In recent years, deep-learning-based systems, which learn optimal features from the data, have increased scoring accuracies for the classical sleep stages of Wake, REM, and Non-REM. Meanwhile, it has been recognized that the statistics of transitional stages such as pre-REM, found between Non-REM and REM, may hold additional insight into the physiology of sleep and are now under active investigation. We propose a classification system based on a simple neural network architecture that scores the classical stages as well as pre-REM sleep in mice. When restricted to the classical stages, the optimized network showed state-of-the-art classification performance with an out-of-sample F1 score of 0.95 in male C57BL/6J mice. When unrestricted, the network showed a lower F1 score on pre-REM (0.5) compared to the classical stages. This result is comparable to previous attempts to score transitional stages in other species, such as transition sleep in rats or N1 sleep in humans. Nevertheless, we observed that sequences of predictions including pre-REM typically transitioned from Non-REM to REM, reflecting the sleep dynamics observed by human scorers. Our findings provide further evidence for the difficulty of scoring transitional sleep stages, likely because such stages of sleep are under-represented in typical data sets or show large inter-scorer variability. We further provide our source code and an online platform to run predictions with our trained network.
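The per-class F1 scores quoted above follow the standard definition; a minimal sketch (the confusion counts below are invented for illustration, not taken from the study):

```python
# Standard per-class F1 score from confusion-matrix counts.

def f1_score(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall for one class."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for a rare transitional stage such as pre-REM:
print(round(f1_score(tp=50, fp=50, fn=50), 2))  # → 0.5
```

An F1 of 0.5 arises, for instance, when half of the predicted and half of the true pre-REM epochs are missed, which is consistent with the class being under-represented in training data.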
Quantitative nuclear magnetic resonance (qNMR) is considered a powerful tool for multicomponent mixture analysis as well as for the purity determination of single compounds. Special attention is currently paid to the training of operators and study directors involved in qNMR testing. To assure that only qualified personnel perform sample preparation at our GxP-accredited laboratory, a weighing test was proposed. Sixteen participants performed six-fold weighing of a binary mixture of dibutylated hydroxytoluene (BHT) and 1,2,4,5-tetrachloro-3-nitrobenzene (TCNB). To evaluate the quality of data analysis, all spectra were evaluated manually by a qNMR expert and using an in-house-developed automated routine. The results revealed that the mean values are comparable and that both evaluation approaches are free of systematic error. However, automated evaluation resulted in an approximately 20% increase in precision. The same findings were obtained for qNMR analysis of 32 compounds used in the pharmaceutical industry. The weighing test by six-fold determination in binary mixtures and the automated qNMR methodology can be recommended as efficient tools for evaluating staff proficiency. The automated qNMR method significantly increases the throughput and precision of qNMR for routine measurements and extends the application scope of qNMR.
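The purity determination underlying such a weighing test uses the standard internal-standard qNMR relation. The formula below is the common textbook form, not the in-house routine; the integrals, masses, and standard purity are invented for illustration (the molar masses are those of BHT and TCNB):

```python
# Standard internal-standard qNMR purity relation:
# P_x = (I_x/I_std) * (N_std/N_x) * (M_x/M_std) * (m_std/m_x) * P_std
# I: signal integral, N: nuclei per signal, M: molar mass,
# m: weighed mass, P: purity (as a fraction).

def qnmr_purity(i_x, i_std, n_x, n_std, molar_x, molar_std,
                mass_x, mass_std, p_std):
    return ((i_x / i_std) * (n_std / n_x) * (molar_x / molar_std)
            * (mass_std / mass_x) * p_std)

# Hypothetical example: BHT analyte vs. TCNB internal standard.
purity = qnmr_purity(i_x=0.95, i_std=1.00, n_x=9, n_std=9,
                     molar_x=220.35, molar_std=260.89,
                     mass_x=10.0, mass_std=10.0, p_std=0.999)
print(f"analyte purity: {purity:.4f}")
```

Because the result is a product of ratios, small weighing errors propagate multiplicatively, which is exactly what a six-fold weighing test is designed to expose.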
The treatment method to deactivate viable microorganisms on objects or products is termed sterilization. There are multiple forms of sterilization, each intended to be applied to a specific target, depending on, but not limited to, the thermal, physical, and chemical stability of that target. Herein, an overview of the currently used sterilization processes in the global market is provided. Different sterilization techniques are grouped under a category that describes the method of treatment: radiation (gamma, electron beam, X-ray, and ultraviolet), thermal (dry and moist heat), and chemical (ethylene oxide, ozone, chlorine dioxide, and hydrogen peroxide). For each sterilization process, the typical process parameters as defined by regulations and the mode of antimicrobial activity are summarized. Finally, the recommended microorganisms used as biological indicators to validate sterilization processes in accordance with the rules established by various regulatory agencies are summarized.
An acetoin biosensor based on a capacitive electrolyte–insulator–semiconductor (EIS) structure modified with the enzyme acetoin reductase, also known as butane-2,3-diol dehydrogenase (Bacillus clausii DSM 8716ᵀ), is applied for acetoin detection in beer, red wine, and fermentation broth samples for the first time. The EIS sensor consists of an Al/p-Si/SiO₂/Ta₂O₅ layer structure with acetoin reductase immobilized on top of the Ta₂O₅ transducer layer by means of crosslinking with glutaraldehyde. The unmodified and enzyme-modified sensors are electrochemically characterized by means of leakage-current, capacitance–voltage, and constant-capacitance measurements.
This paper compares several blade element theory (BET) method-based propeller simulation tools, including an evaluation against static propeller ground tests and high-fidelity Reynolds-averaged Navier–Stokes (RANS) simulations. Two proprietary propeller geometries for paraglider applications are analysed in static and flight conditions. The RANS simulations are validated with the static test data and used as a reference for comparing the BET tools in flight conditions. The comparison includes the analysis of varying 2D aerodynamic airfoil parameters and different induced velocity calculation methods. The evaluation shows the strength of the BET tools compared to RANS simulations. The RANS simulations underpredict the static experimental data within 10% relative error, while appropriate BET tools overpredict the RANS results by 15–20% relative error. A variation of the 2D aerodynamic data demonstrates the need for highly accurate 2D data for accurate BET results. The nonlinear BET coupled with XFOIL for the 2D aerodynamic data agrees best with RANS in static operation and flight conditions. The novel BET tool PropCODE combines both approaches and offers further correction models for highly accurate static and flight condition results.
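The core of any BET tool is the integration of sectional loads along the blade. The sketch below is a generic, deliberately stripped-down version of that idea (no induced velocity, constant chord and lift coefficient), not the PropCODE implementation; all numbers are hypothetical:

```python
# Minimal blade-element-theory sketch: sum sectional lift over blade
# elements to estimate static thrust, with inflow/induced velocity
# neglected for brevity (a real BET tool iterates on induced velocity).
import math

def bet_thrust(rpm, radius_m, chord_m, n_blades, cl, rho=1.225, n_elem=50):
    """Crude static thrust: sum dT = 0.5*rho*V^2*c*Cl*dr over elements."""
    omega = rpm * 2.0 * math.pi / 60.0   # shaft speed in rad/s
    dr = radius_m / n_elem
    thrust = 0.0
    for i in range(n_elem):
        r = (i + 0.5) * dr               # element midpoint radius
        v = omega * r                    # local tangential speed
        thrust += 0.5 * rho * v**2 * chord_m * cl * dr
    return n_blades * thrust

# Hypothetical paraglider-class propeller:
thrust = bet_thrust(rpm=3000, radius_m=0.5, chord_m=0.05, n_blades=2, cl=0.8)
print(f"estimated static thrust: {thrust:.1f} N")
```

The abstract's point about 2D airfoil data sensitivity is visible even here: thrust scales linearly with the sectional Cl, so errors in the 2D polars pass straight through to the integrated loads.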
In traditional microbial biobutanol production, the solvent must be recovered during the fermentation process for a sufficient space-time yield. Thermal separation is not feasible due to the boiling point of n-butanol. As an integrated and selective solid-liquid separation alternative, solvent-impregnated resins (SIRs) were applied. Two polymeric resins were evaluated and an extractant screening was conducted. Vacuum application with vapor collection in a fixed-bed column, operated as a bioreactor bypass, was successfully implemented as the butanol desorption step. To further improve process economics, fermentation with renewable lignocellulosic substrates was conducted using Clostridium acetobutylicum. Utilization of SIRs was shown to be a potential strategy for solvent removal from fermentation broth, while the application of a bypass column allows for simultaneous product removal and recovery.
In this chapter, the key technologies and the instrumentation required for the subsurface exploration of ocean worlds are discussed. The focus is on Jupiter's moon Europa and Saturn's moon Enceladus because they have the highest potential for such missions in the near future. The exploration of their oceans requires landing on the surface, penetrating the thick ice shell with an ice-penetrating probe, and probably diving with an underwater vehicle through dozens of kilometers of water to the ocean floor to have the chance to find life, if it exists. Technologically, such missions are extremely challenging. The required key technologies include power generation, communications, pressure resistance, radiation hardness, corrosion protection, navigation, miniaturization, autonomy, and sterilization and cleaning. Simpler mission concepts involve impactors and penetrators or, in the case of Enceladus, plume fly-through missions.
For short take-off and landing (STOL) aircraft, a parallel hybrid-electric propulsion system potentially offers superior performance compared to a conventional propulsion system, because the short-take-off power requirement is much higher than the cruise power requirement. This power-matching problem can be solved with a balanced hybrid propulsion system. However, there is a trade-off between wing loading, power loading, the level of hybridization, and range and take-off distance. An optimization method can vary design variables in such a way that a minimum of a particular objective is attained. In this paper, a comparison between the optimization results for minimum mass, minimum consumed primary energy, and minimum cost is conducted. A new initial sizing algorithm for general aviation aircraft with hybrid-electric propulsion systems is applied. This initial sizing methodology covers point performance, mission performance analysis, the weight estimation process, and cost estimation. The methodology is applied to the design of a STOL general aviation aircraft intended for on-demand air mobility operations. The aircraft is sized to carry eight passengers over a distance of 500 km, while being able to take off and land from short airstrips. Results indicate that parallel hybrid-electric propulsion systems must be considered for future STOL aircraft.
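The power-matching idea behind parallel hybridization can be sketched with simple arithmetic. The helper function and all numbers are illustrative assumptions, not the paper's sizing results: the electric motor covers the short-take-off power peak so the combustion engine can be sized closer to the cruise requirement.

```python
# Illustrative power-matching sketch for a parallel hybrid powertrain.

def hybridization_of_power(p_takeoff_kw: float, p_engine_kw: float) -> float:
    """Fraction of the take-off shaft power supplied by the electric motor."""
    p_motor = max(p_takeoff_kw - p_engine_kw, 0.0)
    return p_motor / p_takeoff_kw

# Hypothetical STOL case: 400 kW take-off demand, engine sized at 220 kW
# (a margin above an assumed cruise requirement).
h_p = hybridization_of_power(400.0, 220.0)
print(round(h_p, 2))  # → 0.45
```

This single ratio is one of the design variables the paper's optimizer trades off against wing loading, power loading, range, and take-off distance.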
A light-addressable potentiometric sensor (LAPS) is a field-effect-based (bio-)chemical sensor in which a desired sensing area on the sensor surface can be defined by illumination. Light addressability can be used to visualize the concentration and spatial distribution of target molecules, e.g., H+ ions. This unique feature has great potential for the label-free imaging of the metabolic activity of living organisms. The cultivation of those organisms needs specially tailored surface properties of the sensor. O2 plasma treatment is an attractive and promising tool for rapid surface engineering. However, the potential impacts of the technique must be carefully investigated, as sensors can suffer from plasma-induced damage. Herein, a LAPS with a Ta2O5 pH-sensitive surface is successfully patterned by plasma treatment, and its effects are investigated by contact angle and scanning LAPS measurements. A plasma duration of 30 s (30 W) is found to be the threshold value where excessive wettability begins. Furthermore, this treatment approach causes moderate plasma-induced damage, which can be reduced by thermal annealing (10 min at 300 °C). These findings provide a useful guideline to support future studies where the LAPS surface is desired to be more hydrophilic by O2 plasma treatment.
Recent analysis of scientific data from Cassini and earth-based observations gave evidence for a global ocean under a surrounding solid ice shell on Saturn's moon Enceladus. Images of Enceladus' South Pole showed several fissures in the ice shell with plumes constantly exhausting frozen water particles, building up the E-Ring, one of the outer rings of Saturn. In this southern region of Enceladus, the ice shell is considered to be as thin as 2 km, about an order of magnitude thinner than on the rest of the moon. Under the ice shell, there is a global ocean of liquid water. Scientists are discussing different approaches to taking samples of water, e.g. by melting through the ice with a melting probe. FH Aachen UAS developed a prototype of a maneuverable melting probe that can navigate through the ice and has already been tested successfully in a terrestrial environment. The conditions on Enceladus, however, mean no atmosphere or ambient pressure, low ice temperatures of around 100 to 150 K (near the South Pole), and a very low gravity of 0.114 m/s^2 or 1100 μg. Two of these influencing factors are to be investigated at FH Aachen UAS in 2017: low ice temperature and low ambient pressure below the triple point of water. Low gravity cannot be easily simulated inside a large experiment chamber, though. Numerical simulations of the melting process at RWTH Aachen, however, show a gravity dependence of the melting behavior. Considering this aspect, VIPER provides a link between large-scale experimental simulations at FH Aachen UAS and numerical simulations at RWTH Aachen. To analyze the melting process, about 90 seconds of experiment time in reduced gravity and low ambient pressure is provided by the REXUS rocket. In this time frame, the melting speed and the contact force between the ice and the probe are measured, as well as the heating power and a two-dimensional array of ice temperatures. Additionally, visual and infrared cameras are used to observe the melting process.
Electromechanical model of hiPSC-derived ventricular cardiomyocytes cocultured with fibroblasts
(2018)
The CellDrum provides an experimental setup to study the mechanical effects of fibroblasts co-cultured with hiPSC-derived ventricular cardiomyocytes. Multi-scale computational models based on the Finite Element Method are developed. Coupled electrical cardiomyocyte-fibroblast models (cell level) are embedded into reaction-diffusion equations (tissue level), which compute the propagation of the action potential in the cardiac tissue. Electromechanical coupling is realised by an excitation-contraction model (cell level), and the active stress arising during contraction is added to the passive stress in the force balance, which determines the tissue displacement (tissue level). Tissue parameters in the model can be identified experimentally for the specific sample.
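The tissue-level propagation described above is commonly modelled by a monodomain reaction-diffusion equation. The following is the generic textbook form; the symbols and this particular formulation are assumptions for illustration, not taken from the abstract:

```latex
% Monodomain reaction-diffusion equation for the transmembrane voltage V_m:
\chi \left( C_m \frac{\partial V_m}{\partial t}
    + I_{\mathrm{ion}}(V_m, \mathbf{w}) \right)
  = \nabla \cdot \left( \boldsymbol{\sigma} \, \nabla V_m \right)
    + I_{\mathrm{stim}}
% \chi: surface-to-volume ratio, C_m: membrane capacitance,
% I_ion: cell-level ionic current model (here the coupled
% cardiomyocyte-fibroblast model), \sigma: conductivity tensor,
% I_stim: applied stimulus current
```

In this structure, the cell-level coupled model enters through the ionic current term, while the diffusion term carries the action potential across the tissue, matching the cell-level/tissue-level split described in the abstract.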
Research collaborations provide opportunities for both practitioners and researchers: practitioners need solutions for difficult business challenges, and researchers are looking for hard problems to solve and publish. Nevertheless, research collaborations carry the risk that practitioners focus too much on quick solutions and that researchers tackle only theoretical problems, resulting in products which do not fulfill the project requirements.
In this paper we introduce an approach extending the ideas of agile and lean software development. It helps practitioners and researchers keep track of their common research collaboration goal: a scientifically enriched software product which fulfills the needs of the practitioner’s business model.
This approach gives first-class status to application-oriented metrics that measure progress and success of a research collaboration continuously. Those metrics are derived from the collaboration requirements and help to focus on a commonly defined goal.
An appropriate tool set evaluates and visualizes those metrics with minimal effort, and all participants are pushed to focus on their tasks with appropriate effort. Thus, project status, challenges, and progress are transparent to all research collaboration members at any time.
The Passivhaus building standard is a concept developed for the realization of energy-efficient and economical buildings with simultaneously high utilization comfort under European climate conditions. Major elements of the Passivhaus concept are high thermal insulation of the external walls, the use of heat and/or solar shading glazing, and an airtight building envelope, in combination with energy-efficient technical building installations and heating or cooling generators, such as efficient energy recovery in the building air conditioning. The objective of this research project is to determine the parameters and constraints under which the Passivhaus concept can be implemented under the arid climate conditions of the Arabian Peninsula to achieve an energy-efficient and economical building with high utilization comfort. In cooperation between the Qatar Green Building Council (QGBC), Barwa Real Estate (BRE), and Kahramaa, the first Passivhaus in Qatar and on the Arabian Peninsula was constructed in 2013. The Solar-Institut Jülich of Aachen University of Applied Sciences supports the Qatar Green Building Council with a dynamic building and equipment simulation of the Passivhaus and the neighbouring reference building. This includes simulation studies with different component configurations for the building envelope and different control strategies for the heating or cooling systems as well as the air conditioning of the buildings to find an energetic-economical optimum. Part of these analyses is the evaluation of the energy efficiency of the energy recovery system used in the Passivhaus air conditioning and the identification of possible energy-saving effects from a bypass function integrated in the heat exchanger. In this way, it is expected that on an annual basis the complete electricity demand of the building can be covered by the roof-integrated PV generator.
To give the exchange of goods and services between the European Union (EU) and the United States (U.S.) new momentum, the two parties are currently negotiating the transatlantic free trade agreement known as the Transatlantic Trade and Investment Partnership (TTIP). The aim is to create the largest free trade area in the world. The agreement, once entered into force, will oblige the EU countries and the U.S. to further liberalize their markets.
The negotiations on TTIP include a chapter on Electronic Communications/Telecommunications. The challenge therein will be securing commitments for market access to Electronic Communications services. At the same time, these commitments must reflect the legitimate need for consumer protection. The need to reduce Electronic Communications-related non-tariff barriers to trade between the Parties is due to the fact that these markets are heavily regulated. Without transnational rules on regulation, national governments can abuse these regulations to deter market entry by new (foreign) suppliers. Thus, the free trade agreement TTIP affects in many respects the regulatory provisions on, and access to, Electronic Communications markets. The objective of this paper is therefore to examine to what extent the regulatory principles for Electronic Communications markets envisaged under TTIP will result in trade facilitation and regulatory convergence between the EU and the U.S.
The analysis finds that the chapter on Electronic Communications will be an important step towards facilitating trade in Electronic Communications services. At the same time, some regulatory convergence will take place, but this convergence will not lead to a (full) harmonization of regulations. Rather, even after the TTIP negotiations have been concluded successfully, the norm will be mutual recognition of different regulatory regimes. Different regulations, being the optimal policy response in different market settings, will continue to exist. Moreover, it is very unlikely that such regulatory principles for the Electronic Communications sector will be a vehicle for a race to the bottom in levels of consumer protection.
Smoothed Finite Element Methods for Nonlinear Solid Mechanics Problems: 2D and 3D Case Studies
(2016)
The Smoothed Finite Element Method (SFEM) is presented as an edge-based and a face-based technique for 2D and 3D boundary value problems, respectively. SFEMs avoid shortcomings of the standard Finite Element Method (FEM) with lower-order elements, such as overly stiff behavior, poor stress solutions, and locking effects. Based on the idea of spatially averaging the standard FEM strain field over so-called smoothing domains, the SFEM calculates the stiffness matrix for the same number of degrees of freedom (DOFs) as the FEM. However, the SFEMs significantly improve accuracy and convergence even for distorted meshes and/or nearly incompressible materials.
Numerical results of the SFEMs for a cardiac tissue membrane (thin plate inflation) and an artery (tension of a 3D tube) clearly show their advantageous properties in improving accuracy, particularly for distorted meshes, and in avoiding shear locking effects.
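The strain-smoothing operation at the heart of the SFEM can be sketched in isolation. This is a generic illustration of the area-weighted averaging over a smoothing domain, not the chapter's implementation; the strain values and areas below are hypothetical:

```python
# Sketch of SFEM strain smoothing: the smoothed strain on one smoothing
# domain is the area-weighted average of the compatible FEM strains of
# the sub-domains (e.g. triangle halves adjacent to a shared edge).
import numpy as np

def smoothed_strain(strains, areas):
    """Area-weighted average strain over one smoothing domain.
    strains: (n, 3) compatible strains [eps_xx, eps_yy, gamma_xy]
    of the n sub-domains; areas: (n,) their areas."""
    strains = np.asarray(strains, dtype=float)
    areas = np.asarray(areas, dtype=float)
    return (areas[:, None] * strains).sum(axis=0) / areas.sum()

# Hypothetical edge-based smoothing domain covering two triangle halves:
eps = smoothed_strain([[1e-3, 0.0, 2e-4],
                       [3e-3, 1e-3, 0.0]],
                      [0.5, 1.5])
print(eps)
```

Because the stiffness matrix is then assembled from these smoothed strains instead of the raw element strains, the method softens the overly stiff behavior of low-order elements without adding degrees of freedom.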
Evaluation of fragility curves for a three-storey-reinforced-concrete mock-up of SMART 2013 project
(2016)
Unsteady flow measurements in the wake behind a wind-tunnel car model by using high-speed planar PIV
(2015)
This study investigates unsteady characteristics of the wake behind a 28%-scale car model in a wind tunnel using high-speed planar particle image velocimetry (PIV). The car model is based on a hatchback passenger car that is known to have relatively high fluctuations in its aerodynamic loads. This study primarily focuses on the lateral motion of the flow in the horizontal plane to determine the effect of the flow motion on the straight-line stability and the initial steering response of the actual car on a track. This paper first compares the flow fields in the wake behind the above-mentioned model obtained using conventional and high-speed planar PIV, with sampling frequencies of 8 Hz and 1 kHz, respectively. Large asymmetrically coherent flow structures, which fluctuate at frequencies below 2 Hz, are observed in the results of the high-speed PIV measurements, whereas conventional PIV is unable to capture these features of the flow owing to aliasing. This flow pattern with a laterally swaying motion is represented by opposite signs of the cross-correlation coefficients of streamwise velocity fluctuations for the two sides of the car model. The effects of two aerodynamic devices that are known to reduce the fluctuation levels of the aerodynamic loads are then extensively investigated. The correlation analyses reveal that these devices indeed reduce the fluctuation levels of the flow and the correlation values around the rear combination lamp, but the effects of these devices differ around the C-pillar.
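The sign-based correlation argument can be illustrated with a minimal sketch (synthetic signals, not the measured PIV data): a coherent swaying motion that moves the two sides of the wake in opposite directions yields a negative cross-correlation coefficient between the streamwise velocity fluctuations on the two sides.

```python
# Hypothetical illustration (synthetic data, not the authors' measurements):
# a laterally swaying wake produces streamwise velocity fluctuations of
# opposite sign on the two sides of the model, which shows up as a negative
# cross-correlation coefficient between the two signals.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 10_000)                   # 10 s sampled at 1 kHz
sway = np.sin(2 * np.pi * 1.5 * t)                   # coherent swaying below 2 Hz
u_left = sway + 0.3 * rng.standard_normal(t.size)    # left-side fluctuation
u_right = -sway + 0.3 * rng.standard_normal(t.size)  # right side moves oppositely

r = np.corrcoef(u_left, u_right)[0, 1]               # Pearson cross-correlation coefficient
```

With the signal dominating the noise, `r` comes out strongly negative, matching the opposite-sign pattern reported for the two sides of the model.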
The Scarab Project
(2015)
Urban Search and Rescue (USAR) is an active research field in the robotics community. Despite recent advances on many open research questions, these kinds of systems are not widely used in real rescue missions. One reason is that such systems are complex and not (yet) very reliable; another is that one has to be a robotics expert to run such a system. Moreover, available rescue robots are very expensive and the benefits of using them are still limited.
In this paper, we present the Scarab robot, an alternative design for a USAR robot. The robot is lightweight and human-packable, and its primary purpose is to extend the rescuer's capability to sense the disaster site. The idea is that a responder throws the robot to a certain spot. The robot survives the impact with the ground and relays sensor data, such as camera images or thermal images, to the responder's hand-held control unit, from which the robot can be remotely controlled.
Mechatronics consists of the integration of mechanical engineering, electronics, and computer science/engineering. These broad fields are essential for robotic systems, yet this breadth makes it difficult for researchers to specialize and be experts in all of them. Collaboration between researchers allows experience and specialization to be combined, yielding optimized systems. Collaboration between the European countries and South Africa is critical, as each country has resources available which the other countries might not have. Approvals for restricted applications can also be obtained more easily in some countries than in others, preventing research delays. Some problems that have been experienced are discussed, with the Robotics Center of South Africa presented as a possible solution.
With autonomous mobile robots receiving increased attention in industrial contexts, the need for benchmarks becomes increasingly urgent. The RoboCup Logistics League (RCLL) is one specific industry-inspired scenario focusing on production logistics within a Smart Factory. In this paper, we describe how the RCLL allows assessing the performance of a group of robots within the scenario as a whole, focusing specifically on the coordination and cooperation strategies and the methods and components to achieve them. We report on recent efforts to analyze the performance of teams in 2014 to understand the implications of the current grading scheme, and on criteria and metrics for performance assessment derived from Key Performance Indicators (KPI) adapted from classic factory evaluation. We reflect on differences and compatibility with RoCKIn, a recent major European benchmarking project.
The main objective of our ROS Summer School series is to introduce MA-level students to programming mobile robots with the Robot Operating System (ROS). ROS is a robot middleware that is used by many research institutions world-wide. Therefore, many state-of-the-art algorithms of mobile robotics are available in ROS and can be deployed very easily. As a basic robot platform we deploy a 1/10-scale RC car that is equipped with an Arduino micro-controller to control the servo motors, and an embedded PC that runs ROS. In two weeks, participants learn the basics of mobile robotics hands-on. We describe our teaching concepts and our curriculum and report on the learning success of our students.
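As a flavor of the low-level glue code students encounter on such a platform, here is a minimal sketch (hypothetical parameters and function names, not the course material) of how a steering angle could be mapped to the servo pulse width that an Arduino-driven steering servo expects.

```python
# Minimal sketch (hypothetical parameters, not the course's code): map a
# steering angle in degrees to a hobby-servo PWM pulse width in microseconds.
# Standard hobby servos expect roughly 1000-2000 us, with 1500 us at center.

def angle_to_pulse_us(angle_deg, max_angle=30.0, center_us=1500.0, span_us=500.0):
    """Linearly map [-max_angle, +max_angle] degrees to [center-span, center+span] us."""
    angle_deg = max(-max_angle, min(max_angle, angle_deg))  # clamp to servo limits
    return center_us + (angle_deg / max_angle) * span_us

angle_to_pulse_us(0.0)    # center position
angle_to_pulse_us(30.0)   # full deflection one way
```

On the real platform this mapping would sit between a ROS steering command topic and the micro-controller's PWM output.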
A multi-functional device for safe maintenance at high altitude on wind turbines
(2015)
A 3D finite element model of the female pelvic floor for the reconstruction of urinary incontinence
(2014)
The determination of spacing, edge and end distance requirements for self-tapping screws requires numerous and comprehensive insertion tests. Yet the results of such tests cannot be transferred to other types of screws or even to screws of different diameter because of differences in shape or geometry. To reduce the effort of insertion tests a new method was developed which allows the estimation of required spacings, distances and timber thickness.
The carbonized rice husk (CRH) was evaluated for its wound healing activity in rats using excision models. In this study, the influence of CRH on wound healing in rat skin in vivo and on the cellular behavior of human dermal fibroblasts in vitro was investigated. The results showed that CRH treatment promoted wound epithelization in rats and exhibited moderate inhibition of cell proliferation in vitro. Wounds treated with CRH in lanolin oil were found to epithelize faster than controls.
Recently, the SHARP Corporation, Japan, has developed the world's first "Plasma Cluster Ions (PCI)" air purification technology, using plasma discharge to generate cluster ions. The new plasma cluster device releases positive and negative ions into the air, which are able to decompose and deactivate harmful airborne substances by chemical reactions. Because cluster ions consist of positive and negative ions that normally exist in the natural world, they are harmless and safe to humans. The amount of ozone generated by cluster ions is less than 0.01 ppm, which is significantly less than the 0.05 ppm standard for industrial operations and consumer electronics; this amount thus has no harmful effects on the human body. However, the particular properties and chemical processes involved in PCI treatment are still under study. It has been shown that PCI in most cases exhibit strongly pronounced irreversible killing effects on airborne microflora due to free-radical-induced reactions and can be considered a potent technology for disinfecting home, medical, and industrial appliances.
Summary and Conclusions: PCIs were clearly effective in terms of their antibacterial effects on the strains tested. This efficacy increased with the time the bacteria were exposed to PCIs. The bactericidal action proved to be irreversible. PCIs were significantly less effective in shadowed areas. PCI exposure caused damage to multiple proteins, as observed in SDS-PAGE studies. Bacterial death was caused not by a single molecular mechanism but by multiple ones.
Recently, the SHARP Corporation has developed the world's first "Plasma Cluster Ions (PCI)" air purification technology, which uses plasma discharge to generate cluster ions. The new plasma cluster device releases positive and negative ions into the air, which are harmless to humans and are able to decompose and deactivate airborne substances by chemical reactions. Many phenomenological tests of the PCI air purification technology on microbial cells have been conducted, and in most cases it has been shown that PCI demonstrate a strongly pronounced killing effect, although the particular mechanisms of PCI action are still not evident. We studied variations in resistance to PCI among gram-positive airborne microorganisms, as well as some dose-dependent, spatial, cultural, and biochemical properties of PCI action on Staphylococcus spp., Enterococcus spp., and Micrococcus spp.
Recently, SHARP corporation has developed the world’s first "Plasma Cluster Ions® (PCI)" air purification technology, which uses plasma discharge to generate cluster ions. The new Plasma Cluster Device releases positive and negative ions into the air, which are harmless to humans and are able to decompose and deactivate airborne substances by chemical reactions. In the past, phenomenological tests on the efficacy of the PCI air purification technology on microbial cells have been conducted. In most cases, it has been shown that PCI demonstrated strongly pronounced killing effects on microorganisms. However, the particular mechanisms of PCI action still have to be uncovered.
Mechanical stimulation of the cells resulted in evident changes in cell morphology, protein composition, and gene expression. Microscopically, additional formation of stress fibers accompanied by cell rearrangements in the monolayer was observed. Also, significant activation of the p53 gene was revealed compared to the control. Interestingly, the use of the CellTech membrane coating induced cell death after mechanical stress had been applied. Such an effect was not detected when fibronectin had been used as the adhesion substrate.
The sorption of LPS toxic shock by nanoparticles based on carbonized vegetable raw materials
(2008)
Immobilization of lactobacilli on high-temperature carbonized vegetable raw material (rice husk, grape stones) increases their physiological activity and the quantity of antibacterial metabolites, which consequently increases the antagonistic activity of the lactobacilli. This implies that the use of nanosorbents for attaching probiotic microorganisms is highly promising for solving important problems such as delivering probiotic preparations to the right site and attaching them to the intestinal mucosa, with subsequent detoxification of the gastrointestinal tract and normalization of its microecology. In addition, the carbonized nanoparticles obtained have a peculiar property: the ability to sorb LPS and hence to prevent LPS toxic shock by detoxification.
A melting probe equipped with an autofluorescence-based detection system combined with a light scattering unit and, optionally, with a microarray chip would be ideally suited to probe icy environments such as Europa's ice layer, as well as the polar ice layers of Earth and Mars, for recent and extinct life.
From these results we conclude that proteins, mainly in vitro, denature completely at temperatures between 57 °C and 62 °C, and that they are also affected by NO and by different ion types. NO in particular causes earlier protein denaturation, meaning that it has a destabilizing effect on proteins. Different ions likewise alter protein denaturation: some ions cause earlier denaturation while others do not.
The absence of a general method for endotoxin removal from liquid interfaces gives an opportunity to find new methods and materials to close this gap. Activated nanostructured carbon is a promising material that has shown good adsorption properties due to its vast pore network and high surface area. The aim of this study is to find the adsorption rates for a carbonaceous material produced at different temperatures, as well as to reveal possible differences in the performance of the material for each of the adsorbates used during the study (hemoglobin, serum albumin, and lipopolysaccharide, LPS).
One interesting but not well-known property of water is the appearance of highly ordered structures in response to a strong electric field. In 1893, Sir William Armstrong placed a cotton thread between two wine glasses filled with chemically pure water. When a high DC voltage was applied between the glasses, a connection consisting of water formed, producing a "water bridge".
We present the novel concept of a combined drilling and melting probe for subsurface ice research. This probe, named "IceMole", is currently being developed, built, and tested at the FH Aachen University of Applied Sciences' Astronautical Laboratory. Here, we describe its first prototype design and report the results of its field tests on the Swiss Morteratsch glacier. Although the IceMole design is currently adapted to terrestrial glaciers and ice shields, it may later be modified for the subsurface in-situ investigation of extraterrestrial ice, e.g., on Mars, Europa, and Enceladus. If life exists on those bodies, it may be present in the ice (as life can also be found in the deep ice of Earth).
Tests with palm tree leaves have only just started, and the scan data are currently being analyzed. The final goal of the future project for palm tree gender and species recognition will be to develop optical scanning technology applied to date palm tree leaves for in-situ screening purposes. Depending on the software used and the particular requirements of the users, the technology should potentially be able to identify palm tree diseases, palm tree gender, and the species of young date palm trees by scanning leaves.
The "IceMole" is a novel maneuverable subsurface ice probe for clean in-situ analysis and sampling of subsurface ice and subglacial water/brine. It is developed and built at the FH Aachen University of Applied Sciences' Astronautical Laboratory. A first prototype was successfully tested on the Swiss Morteratsch glacier in 2010. Clean sampling is achieved with a hollow ice screw (as used in mountaineering) at the tip of the probe. Maneuverability is achieved with a differentially heated melting head. Funded by the German Space Agency (DLR), a consortium led by FH Aachen is currently developing a much more advanced IceMole probe, which includes a sophisticated system for obstacle avoidance, target detection, and navigation in the ice. We intend to use this probe for taking clean samples of subglacial brine at the Blood Falls (McMurdo Dry Valleys, East Antarctica) for chemical and microbiological analysis. In our conference contribution, we 1) describe the IceMole design, 2) report the results of the field tests of the first prototype on the Morteratsch glacier, 3) discuss the probe's potential for the clean in-situ analysis and sampling of subsurface ice and subglacial liquids, and 4) outline the way ahead in the development of this technology.
Most conventional methods of air purification use the power of a fan to draw in air and pass it through a filter. The problem of bacterial contamination of the inner parts of such air conditioners in some cases draws attention towards alternative air-cleaning systems. Some manufacturers offer to use ozone's bactericidal and deodorizing effects, but the wide adoption of such systems is restricted by the well-known toxic effects of ozone on human beings. In 2000, Sharp Inc. introduced the "Plasma Cluster Ions (PCI)" air purification technology, which uses plasma discharge to generate cluster ions. This technology has been developed for customers who are conscious of health and hygiene. In our experiments, we focused on some principal aspects of the application of plasma-generated ions: the time dependency and irreversibility of the bactericidal action, the spatial and kinetic characteristics of the emitted cluster particles, and their chemical targets in the microbial cells.
Bacterial lipopolysaccharides (endotoxins) show strong biological effects at very low concentrations in human beings and many animals when entering the blood stream. These include affecting the structure and function of organs and cells, changing metabolic functions, raising body temperature, triggering the coagulation cascade, modifying hemodynamics, and causing septic shock. Because of this toxicity, the removal of even minute amounts is essential for the safe parenteral administration of drugs and also for the care of septic shock patients. The absence of a general method for endotoxin removal from liquid interfaces urgently requires finding new methods and materials to close this gap. Nanostructured carbonized plant parts are a promising material that has shown good adsorption properties due to its vast pore network and high surface area. The aim of this study was the comparative measurement of endotoxin- and blood-protein-related adsorption rates and adsorption capacities for different carbonaceous materials produced at different temperatures and under different surface modifications. As the main surface modifier, the positively charged polymer polyethyleneimine (PEI) was used. The activated carbon materials showed good adsorption properties for LPS and some of the proteins used in the experiments. During the batch experiments, several techniques (dust removal, autoclaving) were used and optimized to improve the material's adsorption behavior. With the results obtained, it was also possible to differentiate the materials according to their adsorption capacities and kinetic characteristics. Modification of the surface apparently did not affect hemoglobin binding to the adsorbent's surface. The obtained adsorption isotherms can be used as a powerful tool for designing future column-based setups for blood purification from LPS, which is especially important for septic shock treatment.
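Batch-adsorption isotherms of the kind obtained in such studies are commonly described by the Langmuir model; as a minimal sketch (synthetic data, not the study's measurements), the model parameters can be estimated from measured concentration/loading pairs with a simple line fit of the linearized form.

```python
# Hypothetical illustration (synthetic data, not the study's measurements):
# the Langmuir model  q = q_max * K * c / (1 + K * c)  describes many
# adsorption isotherms. Its linearized form
#   c/q = 1/(q_max*K) + c/q_max
# lets q_max and K be estimated with an ordinary line fit.
import numpy as np

def fit_langmuir(c, q):
    """Estimate (q_max, K) from equilibrium concentrations c and loadings q."""
    c = np.asarray(c, dtype=float)
    q = np.asarray(q, dtype=float)
    slope, intercept = np.polyfit(c, c / q, 1)  # fit c/q versus c
    q_max = 1.0 / slope
    K = slope / intercept
    return q_max, K

# Synthetic isotherm generated with q_max = 50 mg/g and K = 0.2 L/mg:
c = np.array([1.0, 5.0, 10.0, 20.0, 50.0])
q = 50.0 * 0.2 * c / (1.0 + 0.2 * c)
q_max, K = fit_langmuir(c, q)
```

Fitted parameters like `q_max` (capacity) and `K` (affinity) are exactly what allows ranking materials by adsorption capacity, as done in the study.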
The demand for replacements for inoperable organs exceeds the number of available organ transplants. Therefore, tissue engineering has developed as a multidisciplinary field of research for autologous in-vitro organs. Such three-dimensional tissue constructs require the use of a bioreactor. The UREPLACE bioreactor is used to grow cells on tubular collagen scaffolds (OPTIMAIX Sponge 1) with a maximal length of 7 cm, in order to culture an adequate ureter replacement in vitro. With a rotating unit, (urothelial) cells can be placed homogeneously on the inner scaffold surface. Furthermore, mechanical stimulation is integrated into this bioreactor, resulting in an orientation of the muscle cells. These culturing methods require precise control of several parameters and actuators. A combination of a LabBox and the suitable software LabVision is used to set and monitor parameters like rotation angles, velocities, pressures, and other important cell culture values. The bioreactor was successfully tested for water tightness. Furthermore, the temperature control was adjusted to 37 °C and the CO2 concentration regulated to 5 %. Additionally, the pH step responses of several substances confirmed correct functioning of the designed flow chamber. All software used was tested and remained stable for several days.
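The temperature regulation described above can be illustrated with a minimal sketch (hypothetical thresholds and names, not the actual LabBox/LabVision logic): a bang-bang controller with hysteresis holding a chamber at its 37 °C setpoint.

```python
# Minimal sketch (hypothetical thresholds, not the UREPLACE control code):
# a bang-bang controller with hysteresis, as typically used to hold an
# incubation chamber at a 37 degC setpoint without rapid relay chatter.

def heater_command(temp_c, heater_on, setpoint=37.0, band=0.2):
    """Return the new heater state given the current temperature and state."""
    if temp_c < setpoint - band:
        return True           # too cold: switch heater on
    if temp_c > setpoint + band:
        return False          # too warm: switch heater off
    return heater_on          # inside the band: keep the current state

heater_command(36.5, False)   # below the band, so the heater turns on
```

The dead band keeps the heater from toggling on every sensor reading; the same pattern applies to CO2 valve control around the 5 % setpoint.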
Plant physiology and plant stress: Plant physiology will become much more important for humankind because the yield and cultivation limits of crops are determined by their resistance to stress. To assess and counteract various stress factors, it is necessary to conduct plant research to gain information and results on plant physiology.
The importance of the availability of stored blood or blood cells, respectively, for urgent transfusion cannot be overestimated. Nowadays, blood storage becomes even more important since blood products are used for epidemiological studies, bio-technical research or banked for transfusion purposes. Thus blood samples must not only be processed, stored, and shipped to preserve their efficacy and safety, but also all parameters of storage must be recorded and reported for Quality Assurance. Therefore, blood banks and clinical research facilities are seeking more accurate, automated means for blood storage and blood processing.
Proceedings of the 2nd Humboldt Kolleg, Hammamet, Tunisia. Organizer: Alexander von Humboldt Stiftung, Germany. 184 p.
Welcome Address
Dear Participants, welcome to the 2nd Humboldt Kolleg on "Nanoscale Science and Technology" (NS&T'12) in Tunisia, sponsored by the Alexander von Humboldt foundation. The NS&T'12 multidisciplinary scientific program includes seven "hot" topics in nanoscale science and technology, covering basic and application-oriented research as well as industrial (market) aspects:
- Molecular Biophysics, Spectroscopy Techniques, Imaging Microscopy
- Nanomaterials Synthesis for Medicine and Bio-chemical Sensors
- Nanostructures, Semiconductors, Photonics and Nanodevices
- New Technologies in Market Industry
- Environment, Electro-chemistry, Bio-polymers and Fuel Cells
- Nanomaterials, Photovoltaic, Modelling, Quantum Physics
- Microelectronics, Sensors Networks and Embedded Systems
We are deeply indebted to all members of the Scientific Committee and the General Chairs of the joint sessions, and to all speakers and chairmen, who have dedicated invaluable time and effort to the realization of this event. On behalf of the Organizing Committee, we cordially invite you to join the conference and hope that your stay will be fruitful, rewarding, and enjoyable.
Prof. Dr. Michael J. Schöning, Prof. Dr. Adnane Abdelghani
Summary: This paper presents a methodology to study and understand the mechanics of stapled anastomotic behavior by combining empirical experimentation and finite element analysis. The performance of the stapled anastomosis is studied in terms of leakage, and the numerical results are compared to in vitro experiments performed on fresh porcine tissue. The results suggest that leaks occur between the tissue and the staple legs penetrating through the tissue.
Shock waves, explosions, impacts, or cavitation bubble collapses may generate stress waves in solids, causing cracks or unexpected damage due to focusing, physical nonlinearity, or interaction with existing cracks. There is a growing interest in wave propagation, which poses many novel problems to experimentalists and theorists.
In: Advanced Engineering Informatics, Vol. 21, Issue 1, 2007, pp. 67-83. http://dx.doi.org/10.1016/j.aei.2006.10.001. Eds. J.C. Kunz, I.F.C. Smith and T. Tomiyama, Elsevier, pp. 1-22. Current CAD tools are not able to support the conceptual design phase, and none of them provides a consistency analysis for sketches produced by architects. This phase is fundamental and crucial for the whole design and construction process of a building. To give architects better support, we developed a CAD tool for conceptual design and a knowledge specification tool. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic and domain-specific knowledge ontology, different types of design rules formalize this knowledge in a graph-based form. An expressive visual language provides a user-friendly, human-readable representation. Finally, a consistency analysis tool enables conceptual designs to be checked against this formal conceptual knowledge. In this article, we concentrate on the knowledge specification part. For that, we introduce the concepts and usage of a novel visual language and describe its semantics. To demonstrate the usability of our approach, two graph-based visual tools for knowledge specification and conceptual design are explained.
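The kind of graph-based design rule described above can be sketched minimally (hypothetical rule and data, not the project's actual rule specifications): the conceptual sketch is represented as a graph of rooms and links, and a building-class rule is checked against it.

```python
# Minimal sketch (hypothetical rule and data, not the project's code):
# a conceptual design as a graph of rooms and undirected links, checked
# against a building-class rule such as
# "every office must be linked to a corridor".

rooms = {"r1": "office", "r2": "corridor", "r3": "office"}
links = {("r1", "r2")}                       # undirected room links

def neighbours(room):
    """All rooms linked to `room`, in either link direction."""
    return ({b for a, b in links if a == room} |
            {a for a, b in links if b == room})

def check_offices_reach_corridor():
    """Return the offices violating the rule, to be reported to the architect."""
    return [r for r, kind in rooms.items()
            if kind == "office"
            and not any(rooms[n] == "corridor" for n in neighbours(r))]

check_offices_reach_corridor()   # r3 has no corridor link, so it is reported
```

A consistency analysis tool of the kind described would run many such rules incrementally as the sketch evolves and report each violation back to the architect.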
In: Proc. of the 11th Intl. Conf. on Computing in Civil and Building Engineering (ICCCBE-XI), ed. Hugues Rivard, Montreal, Canada, pp. 1-12, ASCE (CD-ROM), 2006. Currently, the conceptual design phase is not adequately supported by any CAD tool. Neither support while elaborating conceptual sketches nor automatic proof of correctness with respect to effective restrictions is currently provided by any commercial tool. To enable domain experts to store common as well as their personal domain knowledge, we develop a visual language for knowledge formalization. In this paper, a major extension to the already existing concepts is introduced. The possibility to define rule dependencies extends the expressiveness of the knowledge definition language and contributes to the usability of our approach.
Proc. of the 2005 ASCE Intl. Conf. on Computing in Civil Engineering (ICCC 2005), eds. L. Soibelman and F. Pena-Mora, pp. 1-14, ASCE (CD-ROM), Cancun, Mexico, 2005. Current CAD tools are not able to support the fundamental conceptual design phase, and none of them provides consistency analyses of sketches produced by architects. To give architects greater support in the conceptual design phase, we develop a CAD tool for conceptual design and a knowledge specification tool allowing the definition of conceptually relevant knowledge. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic knowledge model, different types of design rules formalize the knowledge in a graph-based realization. An expressive visual language provides a user-friendly, human-readable representation. Finally, consistency analyses enable conceptual designs to be checked against this defined knowledge. In this paper we concentrate on the knowledge specification part of our project.
In: Computer Aided Architectural Design Futures 2005, Part 4, pp. 207-216. DOI: http://dx.doi.org/10.1007/1-4020-3698-1_19. The conceptual design at the beginning of the building construction process is essential for the success of a building project. Even if some CAD tools allow elaborating conceptual sketches, they rather focus on the shape of the building elements and not on their functionality. We introduce semantic roomobjects and roomlinks, using the CAD tool ArchiCAD as an example. These extensions provide a basis for specifying the organisation and functionality of a building and free architects from being forced to directly produce detailed constructive sketches. Furthermore, we introduce consistency analyses of the conceptual sketch, based on an ontology containing conceptually relevant knowledge specific to one class of buildings.
IASSE-2004 - 13th International Conference on Intelligent and Adaptive Systems and Software Engineering, eds. W. Dosch, N. Debnath, pp. 245-250, ISCA, Cary, NC, 1-3 July 2004, Nice, France. We introduce a UML-based model for conceptual design support in civil engineering. To this end, we identify required extensions to standard UML. Class diagrams are used for elaborating building-type-specific knowledge; object diagrams, implicitly contained in the architect's sketch, are validated against the defined knowledge. To enable the use of industrial, domain-specific tools, we provide an integrated conceptual design extension. The developed tool support is based on graph rewriting. With our approach, architects are enabled to deal with semantic objects during the early design phase, assisted by incremental consistency checks.
In: Net-distributed Co-operation: Xth International Conference on Computing in Civil and Building Engineering, Weimar, June 02-04, 2004; proceedings / [ed. by Karl Beuke ...]. Weimar: Bauhaus-Univ. Weimar, 2004, 1st ed., pp. 1-14. ISBN 3-86068-213-X. Summary: In our project, we develop new tools for the conceptual design phase. During conceptual design, the coarse functionality and organization of a building is more important than a detailed worked-out construction. We identify two roles: first, the knowledge engineer, who is responsible for knowledge definition and maintenance; second, the architect, who elaborates the conceptual design. The tool for the knowledge engineer is based on graph technology; it is specified using PROGRES and the UPGRADE framework. The tools for the architect are integrated into the industrial CAD tool ArchiCAD. Consistency between knowledge and conceptual design is ensured by the constraint checker, another extension to ArchiCAD.
ITCE-2003 - 4th Joint Symposium on Information Technology in Civil Engineering, ed. Flood, I., pp. 1-12, ASCE (CD-ROM), Nashville, USA. In this paper we discuss graph-based tools to support architects during the conceptual design phase. Conceptual design precedes constructive design; the concepts used are more abstract. We develop two graph-based approaches: a top-down approach using the graph rewriting system PROGRES, and a more industrially oriented approach in which we extend the CAD system ArchiCAD. In both approaches, knowledge can be defined by a knowledge engineer: in the top-down approach in the domain model graph, in the bottom-up approach in an XML file. The defined knowledge is used to incrementally check the sketch and to inform the architect about violations of the defined knowledge. Our goal is to discover design errors as soon as possible and to support the architect in designing buildings with consideration of conceptual knowledge.
Applications of Graph Transformations with Industrial Relevance, Lecture Notes in Computer Science, 2004, Volume 3062/2004, pp. 434-439. DOI: http://dx.doi.org/10.1007/978-3-540-25959-6_33. This paper gives a brief overview of the tools we have developed to support conceptual design in civil engineering. Based on the UPGRADE framework, two applications, one for the knowledge engineer and another for architects, allow storing domain-specific knowledge and using this knowledge during conceptual design. Consistency analyses check the design against the defined knowledge and inform the architect if rules are violated.
In: Advances in intelligent computing in engineering: proceedings of the 9th International EG-ICE Workshop, Darmstadt, 01-03 August 2002 / Martina Schnellenbach-Held ... (eds.). Düsseldorf: VDI-Verl., 2002. Fortschritt-Berichte VDI, Reihe 4, Bauingenieurwesen; 180; pp. 1-35. The paper describes a novel way to support conceptual design in civil engineering. The designer uses semantic tools guaranteeing certain internal structures of the design result as well as the fulfillment of various constraints. Two different approaches and corresponding tools are discussed: (a) visually specified tools with automatic code generation to determine a design structure and to fix various constraints a design has to obey; these tools are also valuable for design knowledge specialists. (b) Extensions of existing CAD tools to provide semantic knowledge to be used by an architect. It is sketched how these different tools can be combined in the future. The main part of the paper discusses the concepts and realization of two prototypes following the two approaches above. The paper especially discusses that specific graphs and the specification of their structure are useful for both tool realization projects.
Many companies still conduct the worldwide management of people as if neither the external economy nor the internal structure of the firm had changed. The costs of cross-cultural failure, for individuals and their companies, are enormous: personal and family costs; financial, professional, and emotional costs; costs to one's career prospects, to one's self-esteem, to one's marriage and family. This scenario sufficiently describes the reason for learning "the art of crossing cultures" (Craig Storti). To this end, this research paper describes an innovative approach to cross-cultural training, following the didactical ideas of Kolb and Fry, the so-called 'experiential learning'.
Working paper distributed at the 2nd Annual Next Generation Telecommunications Conference 2009, 13th-14th October 2009, Brussels. 14 pages. Abstract: Governments all over Europe are in the process of adopting new broadband strategies. The objective is to create modern telecommunications networks based on powerful broadband infrastructures. In doing so, they aim for innovative and investment-friendly concepts. For instance, in a recently published consultation paper on the subject, the German regulator BNetzA declared that it will take "greater account of ... reducing risks, securing the investment and innovation power, providing planning certainty and transparency - in order to support and advance broadband rollout in Germany". It further states that when regulating wholesale rates it has to be ensured that "... adequate incentives for network rollout are provided on the one hand, while sustainable and fair competition is ensured on the other". Also, an EC draft recommendation on regulated network access is about to set new standards for the regulation of next generation access networks. According to the recommendation, the prices of new assets shall be based on costs plus a project-specific risk premium included in the cost of capital for the investment risk incurred by the operator. This approach has been criticised from various sides. In particular, it has been questioned whether such an approach is adequate to meet the objectives of encouraging both competition and investment in next generation access networks. Against this background, the concept of "long term risk sharing contracts" has been proposed recently as an approach which not only incorporates the various additional risks involved in the deployment of NGA infrastructure, but has several other advantages. This paper will demonstrate that the concept allows competition to evolve at both the retail and wholesale level on fair, objective, non-discriminatory, and transparent terms and conditions.
Moreover, it ensures the highest possible investment incentive in line with socially desirable outcomes. The paper is organised as follows: The next section will briefly outline the importance of encouraging competition and investment in an NGA environment. The third section will specify the design of long term risk sharing contracts in view of achieving these objectives. The fourth section will examine potential problems associated with the concept and elaborate a way of dealing with them. The last section will look at arguments against long term risk sharing contracts. It will be shown that these arguments are not strong enough to build a case against introducing such contracts.
The ANM'09 multi-disciplinary scientific program includes topics in the fields of "Nanotechnology and Microelectronics", ranging from "Bio/Micro/Nano Materials and Interfacing" aspects, "Chemical and Bio-Sensors", "Magnetic and Superconducting Devices", and "MEMS and Microfluidics" through "Theoretical Aspects, Methods and Modelling" to the important bridging topic "Academics meet Industry".