Refine
Year of publication
Institute
- Fachbereich Medizintechnik und Technomathematik (77)
- IfB - Institut für Bioengineering (38)
- Fachbereich Energietechnik (19)
- Fachbereich Wirtschaftswissenschaften (12)
- Fachbereich Maschinenbau und Mechatronik (9)
- Fachbereich Elektrotechnik und Informationstechnik (6)
- Fachbereich Luft- und Raumfahrttechnik (6)
- INB - Institut für Nano- und Biotechnologien (6)
- MASKOR Institut für Mobile Autonome Systeme und Kognitive Robotik (4)
- Fachbereich Bauingenieurwesen (3)
Has Fulltext
- yes (132)
Language
- English (132)
Document Type
- Conference Proceeding (132)
Keywords
- Biosensor (25)
- CAD (7)
- Finite-Elemente-Methode (7)
- civil engineering (7)
- Bauingenieurwesen (6)
- Blitzschutz (6)
- Clusterion (4)
- Sonde (4)
- Telekommunikationsmarkt (4)
- Air purification (3)
Functional testing and characterisation of ISFETs on wafer level by means of a micro-droplet cell
(2006)
A wafer-level functionality testing and characterisation system for ISFETs (ion-sensitive field-effect transistors) has been realised by integrating a specifically designed capillary electrochemical micro-droplet cell into a commercial wafer prober station. The developed system allows "good" ISFETs to be identified and selected at the earliest possible stage, so that expensive bonding, encapsulation and packaging of non-functioning ISFETs is avoided and the costs otherwise wasted on bad dies are reduced. The system is also suitable for wafer-level characterisation of ISFETs in terms of sensitivity, hysteresis and response time. Additionally, it might be utilised for wafer-level testing of further electrochemical sensors.
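The economic argument for wafer-level screening can be sketched with a toy cost model. All figures below are invented for illustration; they are not taken from the paper:

```python
# Hypothetical cost model: dies that fail the wafer-level ISFET test are
# rejected before bonding/encapsulation, so their packaging cost is avoided.

def packaging_cost_saved(n_dies, yield_fraction, cost_per_package):
    """Cost avoided by not packaging dies that fail the wafer-level test."""
    n_bad = round(n_dies * (1.0 - yield_fraction))
    return n_bad * cost_per_package

# Example: 400 dies per wafer, 85 % functional, 2.50 currency units per
# bonding/encapsulation/packaging run (all values are assumptions).
saving = packaging_cost_saved(400, 0.85, 2.50)
print(saving)  # 150.0
```

The point of the abstract is exactly this trade-off: a cheap electrochemical test per die against the packaging cost of every non-functioning die that would otherwise slip through.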
A new and simple method for nanostructuring using conventional photolithography and a layer-expansion or pattern-size reduction technique is presented, which can further be applied to the fabrication of different nanostructures and nano-devices. The method is based on the conversion of a photolithographically patterned metal layer into a metal-oxide mask with improved pattern-size resolution using thermal oxidation. With this technique, the pattern size can be scaled down to dimensions of a few nanometres. The proposed method is demonstrated experimentally by preparing nanostructures with different configurations and layouts, such as circles, rectangles, trapezoids, "fluidic-channel"-, "cantilever"- and meander-type structures.
In this paper, methods of surface modification of different supports, i.e. glass and polymeric beads, for enzyme immobilisation are described. The developed immobilisation method is based on Schiff's base formation between the amino groups on the enzyme surface and the aldehyde groups on the chemically modified surface of the supports. Silicon surfaces modified with APTS and GOPS and carrying immobilised enzyme were characterised by atomic force microscopy (AFM), time-of-flight secondary ion mass spectrometry (ToF-SIMS) and infrared spectroscopy (FTIR). The supports with immobilised enzyme (urease) were also tested in combination with microreactors fabricated in silicon and Perspex, operating in a flow-through system. For microreactors filled with urease immobilised on glass beads (Sigma) and on polymeric beads (PAN), a very high and stable signal (pH change) was obtained. The developed method of urease immobilisation can thus be considered very effective.
In this paper, methods of sample preparation for potentiometric measurement of phenylalanine are presented. Based on spectrophotometric measurements of phenylalanine, the concentrations of the reagents of the enzymatic reaction (10 mM L-Phe, 0.4 mM NAD+, 2 U L-PheDH) were determined. Then, the absorption spectrum of the reaction product, NADH, was monitored (maximum peak at 340 nm). The results obtained by the spectrophotometric method were compared with those obtained by colourimetry using pH indicators. The two above-mentioned methods will be used as references for potentiometric measurements of phenylalanine concentration.
The absence of a general method for endotoxin removal from liquid interfaces creates an opportunity to find new methods and materials to close this gap. Activated nanostructured carbon is a promising material that has shown good adsorption properties owing to its vast pore network and high surface area. The aim of this study is to determine the adsorption rates of a carbonaceous material produced at different temperatures, as well as to reveal possible differences in the performance of the material for each of the adsorbates used in the study (hemoglobin, serum albumin and lipopolysaccharide, LPS).
An H2O2 sensor for application in industrial sterilisation processes has been developed. For this purpose, automated sterilisation equipment at laboratory scale has been constructed using parts from industrial sterilisation facilities. In addition, a software tool has been developed to control the laboratory-scale sterilisation equipment. First measurements with the developed sensor set-up as part of the sterilisation equipment have been performed, and the sensor has been physically characterised by optical microscopy and SEM.
With autonomous mobile robots receiving increased attention in industrial contexts, the need for benchmarks is becoming increasingly urgent. The RoboCup Logistics League (RCLL) is one specific industry-inspired scenario focusing on production logistics within a Smart Factory. In this paper, we describe how the RCLL allows the performance of a group of robots to be assessed within the scenario as a whole, focusing specifically on the coordination and cooperation strategies and on the methods and components used to achieve them. We report on recent efforts to analyze the performance of teams in 2014 in order to understand the implications of the current grading scheme, and we derive criteria and metrics for performance assessment based on Key Performance Indicators (KPIs) adapted from classic factory evaluation. We also reflect on differences from and compatibility with RoCKIn, a recent major European benchmarking project.
Unsteady flow measurements in the wake behind a wind-tunnel car model by using high-speed planar PIV
(2015)
This study investigates unsteady characteristics of the wake behind a 28%-scale car model in a wind tunnel using high-speed planar particle image velocimetry (PIV). The car model is based on a hatchback passenger car that is known to have relatively high fluctuations in its aerodynamic loads. This study primarily focuses on the lateral motion of the flow in the horizontal plane to determine the effect of the flow motion on the straight-line stability and the initial steering response of the actual car on a track. This paper first compares the flow fields in the wake behind the above-mentioned model obtained using conventional and high-speed planar PIV, with sampling frequencies of 8 Hz and 1 kHz, respectively. Large asymmetrically coherent flow structures, which fluctuate at frequencies below 2 Hz, are observed in the high-speed PIV measurements, whereas conventional PIV is unable to capture these features of the flow owing to aliasing. This flow pattern with a laterally swaying motion is reflected in opposite signs of the cross-correlation coefficients of the streamwise velocity fluctuations on the two sides of the car model. The effects of two aerodynamic devices that are known to reduce the fluctuation levels of the aerodynamic loads are then extensively investigated. The correlation analyses reveal that these devices indeed reduce the fluctuation levels of the flow and the correlation values around the rear combination lamp, but their effects are found to differ around the C-pillar.
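The correlation measure behind the "laterally swaying" diagnosis can be illustrated with a minimal sketch (not the authors' code): a low-frequency swaying mode produces streamwise velocity fluctuations of opposite sign on the two sides of the model, i.e. a strongly negative cross-correlation coefficient. The signals below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)            # 10 s sampled at 200 Hz (assumed)
sway = np.sin(2 * np.pi * 1.0 * t)          # ~1 Hz swaying mode, below 2 Hz

u_left = sway + 0.3 * rng.standard_normal(t.size)    # left-side fluctuation
u_right = -sway + 0.3 * rng.standard_normal(t.size)  # right side, anti-phase

def corr_coeff(a, b):
    """Zero-lag cross-correlation coefficient of two fluctuation signals."""
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

r = corr_coeff(u_left, u_right)
print(r)  # strongly negative (close to -1) for the anti-phase sway
```

Note also why the 8 Hz conventional PIV fails here: the Nyquist frequency of an 8 Hz system is 4 Hz, so a 2 Hz mode is formally resolvable but heavily contaminated once broadband turbulence above 4 Hz aliases into the same band, which the 1 kHz system avoids.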
Quartz crystal nanobalance (QCN) sensors are considered powerful mass-sensitive sensors for determining materials at the sub-nanogram level. In this study, a single piezoelectric quartz crystal nanobalance modified with polystyrene was employed to detect benzene, toluene, ethylbenzene and xylene (BTEX compounds). The frequency shift of the QCN sensor was found to be linear with the BTEX compound concentrations in the range of about 1-45 mg l-1. The correlation coefficients for benzene, toluene, ethylbenzene and xylene were 0.991, 0.9977, 0.9946 and 0.9971, respectively. Principal component analysis was also utilized to process the frequency-response data of the single piezoelectric crystal at different times, taking into account the different adsorption-desorption dynamics of the BTEX compounds. Using principal component analysis, it was found that over 90% of the data variance could be explained by two principal components (PC1 and PC2). Subsequently, benzene and toluene could be successfully identified through principal component analysis of the transient responses of the polystyrene-modified QCN sensor. The results showed that the polystyrene-modified QCN had favorable identification and quantification performance for the BTEX compounds.
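The PCA step described above (transient frequency-shift responses reduced to two components that capture over 90% of the variance) can be sketched as follows. The data matrix here is synthetic; only the procedure is illustrated, not the paper's measurements.

```python
import numpy as np

# Rows: repeated exposures; columns: frequency shift sampled at several
# times during adsorption/desorption. Two latent sorption dynamics
# generate most of the variance (a stand-in for the BTEX dynamics).
rng = np.random.default_rng(1)
n_samples, n_times = 30, 8
latent = rng.standard_normal((n_samples, 2))
loadings = rng.standard_normal((2, n_times))
X = latent @ loadings + 0.05 * rng.standard_normal((n_samples, n_times))

Xc = X - X.mean(axis=0)                       # centre each time channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)               # variance ratio per component
scores = Xc @ Vt[:2].T                        # PC1/PC2 scores per exposure

print(explained[:2].sum())  # > 0.9, as reported for the BTEX data
```

Identification then amounts to clustering or inspecting the PC1/PC2 score plane, where analytes with different adsorption-desorption dynamics separate.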
RaWid was the German national technology programme on transonic aerodynamics and supporting technologies, lasting from 1995 to 1998. One of its main topics was laminar wing development. Besides aerodynamic design work, many operational aspects were investigated. A manufacturing concept was developed for application to operational laminar wings and empennages. It was built as a large-scale manufacturing demonstrator with the aerodynamic shape of a 1.5 m section of the A320 fin nose. Tolerances in shape and roughness fulfilled all requirements. The construction can easily be adapted to varying stiffness and strength requirements. Weight and manufacturing costs are comparable to common nose designs. The mock-up to be designed in ALTTA is based on this manufacturing principle. Another critical point is contamination of suction surfaces. Several tests were performed to investigate perforated titanium suction surfaces at realistic operational conditions:
- a one-year flight test with a suction plate in the stagnation area of the Airbus "Beluga"
- a one-year test of several suction plates in a ground test near the airport
- a one-year test of a working suction ground-test installation under all weather conditions.
No critical results were found. There is no long-term suction degradation visible. Icing conditions and the ground de-icing fluids used at airports did not pose severe problems. The few problems detected require only minor design constraints to be respected.
Hands-on training in high-technology areas is usually limited due to the high cost of lab infrastructure and equipment. One specific example is the field of MEMS, where investment in and upkeep of clean rooms with microtechnology equipment is either financed by production or by R&D projects, greatly reducing the availability for education purposes. For efficient hands-on courses, a MEMS training foundry, currently used jointly by six higher-education institutions, was established at FH Kaiserslautern. In a typical one-week course, students manufacture a micromachined pressure sensor including all lithography, thin-film and packaging steps. This compact and yet complete program is only possible because participants learn to use the different complex machines in advance via a Virtual Training Lab (VTL). In this paper we present the concept of the MEMS training foundry and the VTL preparation, together with results from a scientific evaluation of the VTL over the last three years.
The sorption of LPS toxic shock by nanoparticles on base of carbonized vegetable raw materials
(2008)
Immobilization of lactobacilli on vegetable raw materials carbonized at high temperature (rice husk, grape stones) increases their physiological activity and the quantity of antibacterial metabolites they produce, which consequently increases the antagonistic activity of the lactobacilli. This implies that the use of such nanosorbents for the attachment of probiotic microorganisms is highly promising for solving important problems such as delivering probiotic preparations to the right location, attaching them to the intestinal mucosa, and the subsequent detoxication of the gastro-intestinal tract and normalization of its microecology. In addition, the obtained carbonized nanoparticles have a peculiar property: the ability to sorb the LPS responsible for toxic shock and, hence, to detoxify LPS.
This paper reports the first microbial biosensor for rapid and cost-effective determination of the organophosphorus pesticides fenitrothion and EPN. The biosensor consists of the recombinant PNP-degrading/oxidizing bacterium Pseudomonas putida JS444, anchoring and displaying organophosphorus hydrolase (OPH) on its cell surface, as the biological sensing element, and a dissolved-oxygen electrode as the transducer. Surface-expressed OPH catalyzed the hydrolysis of fenitrothion and EPN to release 3-methyl-4-nitrophenol and p-nitrophenol, respectively, which were oxidized to carbon dioxide by the enzymatic machinery of Pseudomonas putida JS444 while consuming oxygen, which was measured and correlated to the concentration of organophosphates. Under the optimum operating conditions, the biosensor was able to measure as little as 277 ppb of fenitrothion and 1.6 ppm of EPN without interference from phenolic compounds and other commonly used pesticides such as carbamate pesticides, triazine herbicides and organophosphate pesticides without a nitrophenyl substituent. The applicability of the biosensor to lake water was also demonstrated.
An array of 50 MHz quartz microbalances (QMBs) coated with a dendronized polymer was used to detect small amounts of volatile organic compounds (VOCs) in the gas phase. The results were compared to those obtained with the commonly used 10 MHz QMBs. The 50 MHz QMBs proved to be a powerful tool for the detection of VOCs in the gas phase; therefore, they represent a promising alternative to the much more delicate surface acoustic wave devices (SAWs).
Proc. of the 2005 ASCE Intl. Conf. on Computing in Civil Engineering (ICCC 2005), eds. L. Soibelman and F. Pena-Mora, pp. 1-14, ASCE (CD-ROM), Cancun, Mexico, 2005.
Current CAD tools are not able to support the fundamental conceptual design phase, and none of them provides consistency analyses of sketches produced by architects. To give architects greater support in the conceptual design phase, we develop a CAD tool for conceptual design and a knowledge specification tool allowing the definition of conceptually relevant knowledge. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic knowledge model, different types of design rules formalize the knowledge in a graph-based realization. An expressive visual language provides a user-friendly, human-readable representation. Finally, consistency analyses enable conceptual designs to be checked against this defined knowledge. In this paper we concentrate on the knowledge specification part of our project.
In: Net-distributed Co-operation: Xth International Conference on Computing in Civil and Building Engineering, Weimar, June 02-04, 2004; proceedings / [ed. by Karl Beuke ...]. Weimar: Bauhaus-Univ. Weimar, 2004. 1st ed., pp. 1-14. ISBN 3-86068-213-X.
In our project, we develop new tools for the conceptual design phase. During conceptual design, the coarse functionality and organization of a building are more important than a construction worked out in detail. We identify two roles: first, the knowledge engineer, who is responsible for knowledge definition and maintenance; second, the architect, who elaborates the conceptual design. The tool for the knowledge engineer is based on graph technology; it is specified using PROGRES and the UPGRADE framework. The tools for the architect are integrated into the industrial CAD tool ArchiCAD. Consistency between knowledge and conceptual design is ensured by the constraint checker, another extension to ArchiCAD.
In: Computer Aided Architectural Design Futures 2005, Part 4, pp. 207-216, DOI: http://dx.doi.org/10.1007/1-4020-3698-1_19
The conceptual design at the beginning of the building construction process is essential for the success of a building project. Even if some CAD tools allow elaborating conceptual sketches, they rather focus on the shape of the building elements and not on their functionality. We introduce semantic roomobjects and roomlinks, by way of example in the CAD tool ArchiCAD. These extensions provide a basis for specifying the organisation and functionality of a building and free architects from being forced to directly produce detailed constructive sketches. Furthermore, we introduce consistency analyses of the conceptual sketch, based on an ontology containing conceptually relevant knowledge specific to one class of buildings.
In: Proc. of the 11th Intl. Conf. on Computing in Civil and Building Engineering (ICCCBE-XI), ed. Hugues Rivard, Montreal, Canada, pp. 1-12, ASCE (CD-ROM), 2006.
Currently, the conceptual design phase is not adequately supported by any CAD tool. Neither support while elaborating conceptual sketches nor automatic proof of correctness with respect to effective restrictions is currently provided by any commercial tool. To enable domain experts to store common as well as their personal domain knowledge, we develop a visual language for knowledge formalization. In this paper, a major extension to the already existing concepts is introduced. The possibility to define rule dependencies extends the expressiveness of the knowledge definition language and contributes to the usability of our approach.
ITCE-2003 - 4th Joint Symposium on Information Technology in Civil Engineering, ed. I. Flood, pp. 1-12, ASCE (CD-ROM), Nashville, USA.
In this paper we discuss graph-based tools to support architects during the conceptual design phase. Conceptual design takes place before constructive design; the concepts used are more abstract. We develop two graph-based approaches: a top-down approach using the graph rewriting system PROGRES, and a more industrially oriented approach in which we extend the CAD system ArchiCAD. In both approaches, knowledge can be defined by a knowledge engineer: in the top-down approach in the domain model graph, in the bottom-up approach in an XML file. The defined knowledge is used to incrementally check the sketch and to inform the architect about violations of the defined knowledge. Our goal is to discover design errors as soon as possible and to support the architect in designing buildings with consideration of conceptual knowledge.
In: Advances in intelligent computing in engineering: proceedings of the 9th International EG-ICE Workshop, Darmstadt, 01-03 August 2002 / Martina Schnellenbach-Held ... (eds.). Düsseldorf: VDI-Verl., 2002. Fortschritt-Berichte VDI, Reihe 4, Bauingenieurwesen; 180; pp. 1-35.
The paper describes a novel way to support conceptual design in civil engineering. The designer uses semantic tools guaranteeing certain internal structures of the design result as well as the fulfilment of various constraints. Two different approaches and corresponding tools are discussed: (a) visually specified tools with automatic code generation to determine a design structure as well as to fix various constraints a design has to obey; these tools are also valuable for design knowledge specialists; (b) extensions of existing CAD tools that provide semantic knowledge to be used by an architect. It is sketched how these different tools can be combined in the future. The main part of the paper discusses the concepts and realization of two prototypes following the two approaches above. The paper especially argues that specific graphs and the specification of their structure are useful for both tool realization projects.
Applications of Graph Transformations with Industrial Relevance, Lecture Notes in Computer Science, 2004, Volume 3062/2004, pp. 434-439, DOI: http://dx.doi.org/10.1007/978-3-540-25959-6_33
This paper gives a brief overview of the tools we have developed to support conceptual design in civil engineering. Based on the UPGRADE framework, two applications, one for the knowledge engineer and another for architects, allow domain-specific knowledge to be stored and used during conceptual design. Consistency analyses check the design against the defined knowledge and inform the architect if rules are violated.
The workflow of a high-throughput screening setup for the rapid identification of new and improved sensor materials is presented. The polyol method was applied to prepare nanoparticulate metal oxides as base materials, which were functionalised by surface doping. Using multi-electrode substrates and high-throughput impedance spectroscopy (HT-IS), a wide range of materials could be screened in a short time. Applying HT-IS in the search for new selective gas-sensing materials, an NO2-tolerant NO-sensing material with reduced sensitivities towards other test gases was identified, based on iridium-doped zinc oxide. Analogous behaviour was observed for iridium-doped indium oxide.
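The selection step in such a screening workflow can be sketched as a simple ranking over a (materials x test gases) response matrix. All response values and material labels below are invented placeholders, not measured data from the study:

```python
import numpy as np

gases = ["NO", "NO2", "CO", "H2"]
materials = ["ZnO", "ZnO:Ir", "In2O3", "In2O3:Ir"]

# Hypothetical relative sensor responses |S| per material and gas.
R = np.array([
    [2.0, 3.5, 1.2, 0.8],   # ZnO
    [6.0, 0.4, 0.3, 0.2],   # ZnO:Ir  (NO-selective, NO2-tolerant)
    [1.5, 2.8, 1.0, 0.9],   # In2O3
    [5.2, 0.5, 0.4, 0.3],   # In2O3:Ir
])

# Selectivity score: NO response divided by the largest interferent response.
no_idx = gases.index("NO")
interferents = np.delete(R, no_idx, axis=1)
score = R[:, no_idx] / interferents.max(axis=1)

best = materials[int(np.argmax(score))]
print(best)  # ZnO:Ir
```

In the real workflow the matrix entries come from HT-IS measurements across the multi-electrode substrate, but the ranking logic (target-gas response versus worst cross-sensitivity) is the same.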
IASSE-2004 - 13th International Conference on Intelligent and Adaptive Systems and Software Engineering, eds. W. Dosch, N. Debnath, pp. 245-250, ISCA, Cary, NC, 1-3 July 2004, Nice, France.
We introduce a UML-based model for conceptual design support in civil engineering. To this end, we identify required extensions to standard UML. Class diagrams are used for elaborating building-type-specific knowledge; object diagrams, implicitly contained in the architect's sketch, are validated against the defined knowledge. To enable the use of industrial, domain-specific tools, we provide an integrated conceptual design extension. The developed tool support is based on graph rewriting. With our approach, architects are enabled to deal with semantic objects during the early design phase, assisted by incremental consistency checks.
Lightning protection design of a renewable energy hybrid-system without power mains connection
(2001)
In the year 2000, a direct lightning strike to the hybrid-system VATALI, a system without power mains connection on the Greek island of Crete, resulted in the destruction of and damage to several mechanical and electrical components. The hybrid-system VATALI was not protected against lightning at that time. The hardware damage costs were approx. 60,000 €. The exposed site of the hybrid-system on top of a mountain was and still is the reason for a high risk of lightning strikes, so further strikes have to be expected in the future. In the paper, a fundamental lightning protection design concept for renewable energy hybrid-systems without power mains connection, together with protection measures against direct strikes and overvoltages, is presented in detail. The design concept was realized exemplarily for the hybrid-system VATALI. The hardware costs for the protection measures were about 15,000 €; about 50% of the costs are due to protection measures against direct strikes and 50% to overvoltage protection. Future extensions, new installations or modifications have to be included in the lightning protection design concept of the hybrid-system.
In the paper, a lightning protection design concept for renewable energy hybrid-systems without power mains connection is described. Based on a risk analysis, protection measures against direct strikes and overvoltages are presented in an overview. The design concept is realized exemplarily for the hybrid-system VATALI on the Greek island of Crete. VATALI, not protected against lightning at that time, was struck by lightning in the year 2000, causing destruction of and damage to several mechanical and electrical components at a cost of approx. 60,000 €. The hardware costs for the protection measures were about 15,000 €: about 50% of the costs are due to protection measures against direct strikes, 50% to overvoltage protection.
In the paper, the results obtained from experiments on a model of a reinforced building in the case of a direct lightning strike are compared with calculations. The comparison includes peak values of the magnetic field Hmax, of its derivative (dH/dt)max and of induced voltages umax in typical cable routings. The experiments are performed on a 1:6 scaled building and the results are extrapolated using similarity relations. The calculations are based on the approximate formulae given in IEC 62305-4 and have to be supplemented by a rough estimation of the additional shielding effect of a second reinforcement layer. The comparison shows that the measured peak values of the magnetic field and its derivative are mostly lower than the calculated ones, while the induced voltages are in good agreement. Hence, calculations of the induced voltages based on IEC 62305-4 are a good method for lightning protection studies of buildings in which the reinforcement is used as a grid-like electromagnetic shield.
In the presented paper, field data on damage to electrical and electronic apparatus in households are reported and investigated. These damages (approx. 74,000 cases in total), registered by five German insurance companies in 2005 and 2006, were reported by customers as caused by lightning overvoltages. With the use of stochastic methods it is possible to reassess the collected data and to distinguish between cases which are with high probability caused by lightning overvoltages and those which are not. If there was an indication of a direct strike, the case was excluded, so the focus was only on indirect lightning flashes, i.e. only flashes to ground near the structure and flashes to or near an incoming service line were investigated. The field data contain the location of the damaged apparatus (residence of the policy holder) and the distances from the location of the damage to the nearest cloud-to-ground stroke registered by the German lightning location network BLIDS on the date of the damage. The statistical data, along with some complementary numerical simulations, make it possible to verify the correspondence of the rules of the IEC 62305-2 standard with the field data and to identify some needs for correction. The results could lead to a better understanding of whether a damage reported to an insurance company is really caused by indirect lightning or not.
Hydrophobic magnetic nanoparticles (NPs) consisting of undecanoate-capped magnetite (Fe3O4, average diameter ca. 5 nm) are used to control quantized electron transfer to surface-confined redox units and metal NPs. A two-phase system consisting of an aqueous electrolyte solution and a toluene phase that includes the suspended undecanoate-capped magnetic NPs is used to control the interfacial properties of the electrode surface. The attracted magnetic NPs form a hydrophobic layer on the electrode surface, resulting in a change of the mechanisms of the surface-confined electrochemical processes. A quinone-monolayer-modified Au electrode demonstrates an aqueous type of electrochemical process (2e-+2H+ redox mechanism) for the quinone units in the absence of the hydrophobic magnetic NPs, while the attraction of the magnetic NPs to the surface results in the stepwise single-electron transfer mechanism characteristic of a dry non-aqueous medium. Also, the attraction of the hydrophobic magnetic NPs to an Au electrode surface modified with Au NPs (ca. 1.4 nm) yields a microenvironment with a low dielectric constant that results in single-electron quantum charging of the Au NPs.
Electromechanical model of hiPSC-derived ventricular cardiomyocytes cocultured with fibroblasts
(2018)
The CellDrum provides an experimental setup to study the mechanical effects of fibroblasts co-cultured with hiPSC-derived ventricular cardiomyocytes. Multi-scale computational models based on the Finite Element Method are developed. Coupled electrical cardiomyocyte-fibroblast models (cell level) are embedded into reaction-diffusion equations (tissue level), which compute the propagation of the action potential in the cardiac tissue. Electromechanical coupling is realised by an excitation-contraction model (cell level), and the active stress arising during contraction is added to the passive stress in the force balance, which determines the tissue displacement (tissue level). The tissue parameters of the model can be identified experimentally for each specific sample.
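The tissue-level reaction-diffusion step described above can be illustrated with a minimal sketch. The paper couples detailed cardiomyocyte-fibroblast cell models into a Finite Element tissue model; the toy example below instead uses a FitzHugh-Nagumo cell model and a 1D explicit finite-difference scheme, purely to show how an action potential propagates through a reaction-diffusion equation. All parameter values are illustrative assumptions.

```python
import numpy as np

nx, dx = 100, 0.1          # spatial grid (dimensionless units)
dt, steps = 0.01, 8000     # explicit time stepping (dt*D/dx**2 = 0.2, stable)
D = 0.2                    # diffusion coefficient (conduction)
a, eps, beta = 0.1, 0.01, 0.5   # FitzHugh-Nagumo parameters

v = np.zeros(nx)           # normalised membrane potential
w = np.zeros(nx)           # slow recovery variable
v[:5] = 1.0                # stimulate the left edge

arrived = False
for _ in range(steps):
    lap = np.empty(nx)
    lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2   # interior Laplacian
    lap[0] = 2 * (v[1] - v[0]) / dx**2                   # no-flux (Neumann)
    lap[-1] = 2 * (v[-2] - v[-1]) / dx**2                # boundaries
    v, w = (v + dt * (D * lap + v * (1 - v) * (v - a) - w),  # reaction-diffusion
            w + dt * eps * (beta * v - w))                   # slow recovery
    arrived = arrived or v[-1] > 0.5     # excitation reached the far end?

print(arrived)
```

In the full model this diffusion step is assembled by the FEM, the reaction term comes from the coupled cardiomyocyte-fibroblast kinetics, and the resulting activation drives the excitation-contraction model that feeds active stress into the mechanical force balance.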
Abstracts of the ACHEMA 2000 - International Meeting on Chemical Engineering, Environmental Protection and Biotechnology, May 22-27, 2000, Frankfurt am Main. Achema 2000: special edition / Linde. [Ed.: Linde AG. Red.: Volker R. Leski]. Wiesbaden: Linde AG, 2000. 56 p.: Ill., pp. 79-81.
The Passivhaus building standard is a concept developed for the realization of energy-efficient and economical buildings with simultaneously high comfort of use under European climate conditions. Major elements of the Passivhaus concept are high thermal insulation of the external walls, the use of heat-protection and/or solar-shading glazing as well as an airtight building envelope, in combination with energy-efficient technical building installations and heating or cooling generators, such as efficient energy recovery in the building air-conditioning. The objective of this research project is to determine the parameters and constraints under which the Passivhaus concept can be implemented under the arid climate conditions of the Arabian Peninsula to achieve an energy-efficient and economical building with high comfort of use. In cooperation between the Qatar Green Building Council (QGBC), Barwa Real Estate (BRE) and Kahramaa, the first Passivhaus in Qatar and on the Arabian Peninsula was constructed in 2013. The Solar-Institut Jülich of Aachen University of Applied Sciences supports the Qatar Green Building Council with a dynamic building and equipment simulation of the Passivhaus and the neighbouring reference building. This includes simulation studies with different component configurations for the building envelope and different control strategies for the heating or cooling systems as well as for the air conditioning of the buildings, in order to find an energetic-economic optimum. Part of these analyses is the evaluation of the energy efficiency of the energy recovery system used in the Passivhaus air-conditioning and the identification of possible energy-saving effects through the use of a bypass function integrated in the heat exchanger. In this way it is expected that, on an annual basis, the complete electricity demand of the building can be covered by the roof-integrated PV generator.
Table of contents
1. Introduction
2. Multi-level Technology Transfer Infrastructure
2.1 Level 1: University Education – Encourage the Idea of becoming an Entrepreneur
2.2 Level 2: Post Graduate Education – Improve your skills and focus it on a product family
2.3 Level 3: Birth of a Company – Focus your skills on a product and a market segment
2.4 Level 4: Ready to stand alone – Set up your own business
2.5 Level 5: Grow to be Strong – Develop your business
2.6 Level 6: Competitive and independent – Stay innovative
3. Samples
3.1 Sample 1: Laser Processing and Consulting Centre, LBBZ
3.2 Sample 2: Prototyping Centre, CP
4. Funding – Waste money or even lost Money?
5. Conclusion
GaAs-based Gunn diodes with graded AlGaAs hot-electron injector heterostructures have been developed for the special needs of automotive applications. The fabrication of the Gunn diode chips was based on total substrate removal and the processing of integrated Au heat sinks. In particular, the thermal and RF behavior of the diodes has been analyzed by DC, impedance and S-parameter measurements. The electrical investigations have revealed the functionality of the hot-electron injector. An optimized layer structure could fulfil the requirements of adaptive cruise control (ACC) systems at 77 GHz, with typical output power between 50 and 90 mW.
On 1st January 1998, the German telecom market was fully liberalised. Since then genuine competition between market participants has developed, based on a comprehensive legal and regulatory framework that provides safeguards against unfair competition and against market power of Deutsche Telekom. Today, about 10 years after the liberalisation of the telecommunications sector, a revision of this regulatory approach has become necessary, because in at least three dimensions the situation is quite different from the one 10 years ago: First, with numerous alternative operators established in the market, monopolies have been successfully challenged and competition has been introduced. Second, not only is cable TV becoming a viable alternative for the provision of broadband services in large parts of Germany, but mobile services are also increasingly becoming a substitute for fixed services. Last but not least, important technological changes are under way, requiring huge investments in infrastructure upgrades for next generation networks. In the light of these new developments the question is to what extent the current regulatory approach of severe ex-ante regulatory intervention is still appropriate. Is any part of the network of the former incumbent still a bottleneck? A more light-handed regulatory approach might be the right response to this new situation. The paper is organised as follows: The first section briefly examines the economic rationale for regulating network access. Based on the assumption that regulation is necessary whenever bottlenecks exist, regulatory principles for an efficient network access regime are derived. The second section compares the situation of the German market in early 1998 with the one of today, considering three dimensions: the degree of competition, the potential for substitution and technological developments. The third section defines some requirements for the future regulation of telecom markets.
Proposals are elaborated for how to ensure competitive telecom markets in the light of new economic and technological challenges.
A key feature of future broadband markets will be the diversity of access technologies, meaning that numerous technologies will be exploited for broadband communication. Various factors will affect the success of these future broadband markets, regulatory policy being one amongst others. So far, no coherent regulatory approach exists for broadband markets. First results of policies to date suggest that less sector-specific regulation is likely to occur. Instead, regulators must ensure that access to networks and services of potentially dominant providers in a relevant broadband market satisfies requirements for openness and non-discrimination. In this environment the future challenge of regulating broadband markets will be to set the right incentives for investment in new infrastructures. This paper examines whether there is a need for the regulation of future broadband access markets and, if so, what the appropriate regulatory tool is. The focus is on the analysis of European broadband markets and the regulatory approaches applied. The first section provides a description of the characteristics of future broadband markets. The second section discusses possible bottlenecks in broadband markets and their regulatory implications. The third section examines regulatory issues concerning access to broadband networks in more detail, by comparing the regulatory approaches of European countries and the results in terms of broadband penetration. The final section gives key recommendations for a regulatory strategy for broadband access markets.
Market data for the German telecom market shows that Deutsche Telekom, as the former incumbent, is constantly losing shares in all markets for voice telephony: the market for local calls, the market for long-distance calls and the market for international calls. At the same time prices decline steadily, with the latest trend being that operators offer voice services free of charge, the costs of which are covered by a monthly subscription charge. Against this background the paper examines the state of policy and regulatory reform in the telecommunications sector in Germany almost 10 years after the liberalisation of the fixed telecommunications market. The focus is on the analysis of the competitive conditions that have been established on the German market for voice telephony services. If these retail markets are competitive, there might be a need to remove remaining regulatory provisions. In the new environment of converging markets, the future challenge of regulating fixed telecom markets might be to ensure that access to the network and/or services of a potentially dominant provider in a relevant market satisfies requirements for openness and non-discrimination.
Working paper distributed at the 2nd Annual Next Generation Telecommunications Conference 2009, 13th–14th October 2009, Brussels. 14 pages. Governments all over Europe are in the process of adopting new broadband strategies. The objective is to create modern telecommunications networks based on powerful broadband infrastructures. In doing so, they aim for innovative and investment-friendly concepts. For instance, in a recently published consultation paper on the subject, the German regulator BNetzA declared that it will take "greater account of … reducing risks, securing the investment and innovation power, providing planning certainty and transparency – in order to support and advance broadband rollout in Germany". It further states that when regulating wholesale rates it has to be ensured that "… adequate incentives for network rollout are provided on the one hand, while sustainable and fair competition is ensured on the other". Also, an EC draft recommendation on regulated network access is about to set new standards for the regulation of next generation access networks. According to the recommendation, the prices of new assets shall be based on costs plus a project-specific risk premium, to be included in the cost of capital, for the investment risk incurred by the operator. This approach has been criticised from various sides. In particular, it has been questioned whether such an approach is adequate to meet the objectives of encouraging both competition and investment in next generation access networks. Against this background, the concept of "long term risk sharing contracts" has been proposed recently as an approach which not only incorporates the various additional risks involved in the deployment of NGA infrastructure, but has several other advantages. This paper will demonstrate that the concept allows competition to evolve at both the retail and wholesale level on fair, objective, non-discriminatory and transparent terms and conditions.
Moreover, it ensures the highest possible investment incentive in line with the socially desirable outcome. The paper is organised as follows: The next section briefly outlines the importance of encouraging competition and investment in an NGA environment. The third section specifies the design of long term risk sharing contracts in view of achieving these objectives. The fourth section examines potential problems associated with the concept and elaborates ways of dealing with them. The last section looks at arguments against long term risk sharing contracts. It will be shown that these arguments are not strong enough to build a case against introducing such contracts.
To give the exchange of goods and services between the European Union (EU) and the United States (U.S.) new momentum, the two parties are currently negotiating the transatlantic free trade agreement Transatlantic Trade and Investment Partnership (TTIP). The aim is to create the largest free trade area in the world. The agreement, once entered into force, will oblige EU countries and the U.S. to further liberalize their markets.
The negotiations on TTIP include a chapter on Electronic Communications/Telecommunications. The challenge therein will be securing commitments for market access to Electronic Communications services. At the same time, these commitments must reflect legitimate consumer protection concerns. The need to reduce Electronic Communications-related non-tariff barriers to trade between the parties stems from the fact that these markets are heavily regulated. Without transnational rules on regulation, national governments can abuse these regulations to deter market entry by new (foreign) suppliers. Thus the free trade agreement TTIP affects in many respects regulatory provisions on, and access to, Electronic Communications markets. The objective of this paper is therefore to examine to what extent the regulatory principles for Electronic Communications markets envisaged under TTIP will result in trade facilitation and regulatory convergence between the EU and the U.S.
On this question, the analysis finds that the chapter on Electronic Communications will be an important step towards facilitating trade in Electronic Communications services. At the same time some regulatory convergence will take place, but this convergence will not lead to a (full) harmonization of regulations. Rather, even after the TTIP negotiations have been successfully concluded, the norm will be mutual recognition of different regulatory regimes. Different regulations, being the optimal policy response in different market settings, will continue to exist. Moreover, it is very unlikely that such regulatory principles for the Electronic Communications sector will be a vehicle for a race to the bottom in levels of consumer protection.
The main objective of our ROS Summer School series is to introduce MA-level students to programming mobile robots with the Robot Operating System (ROS). ROS is a robot middleware that is used by many research institutions worldwide. Therefore, many state-of-the-art algorithms of mobile robotics are available in ROS and can be deployed very easily. As a basic robot platform we deploy a 1/10-scale RC car that is equipped with an Arduino microcontroller to control the servo motors, and an embedded PC that runs ROS. In two weeks, participants learn the basics of mobile robotics hands-on. We describe our teaching concepts and our curriculum and report on the learning success of our students.