Conference Proceeding
Refine
Year of publication
Institute
- Fachbereich Elektrotechnik und Informationstechnik (241)
- Fachbereich Medizintechnik und Technomathematik (218)
- Fachbereich Luft- und Raumfahrttechnik (189)
- Fachbereich Energietechnik (181)
- IfB - Institut für Bioengineering (151)
- Solar-Institut Jülich (110)
- Fachbereich Maschinenbau und Mechatronik (108)
- Fachbereich Bauingenieurwesen (75)
- Fachbereich Wirtschaftswissenschaften (57)
- ECSM European Center for Sustainable Mobility (53)
- MASKOR Institut für Mobile Autonome Systeme und Kognitive Robotik (47)
- INB - Institut für Nano- und Biotechnologien (39)
- Fachbereich Chemie und Biotechnologie (24)
- Kommission für Forschung und Entwicklung (17)
- Nowum-Energy (11)
- Fachbereich Architektur (9)
- Fachbereich Gestaltung (4)
- IaAM - Institut für angewandte Automation und Mechatronik (3)
- Arbeitsstelle für Hochschuldidaktik und Studienberatung (2)
- Institut für Angewandte Polymerchemie (2)
- ZHQ - Bereich Hochschuldidaktik und Evaluation (2)
- Digitalisierung in Studium & Lehre (1)
- Freshman Institute (1)
- Kommission für Planung und Finanzen (1)
- Senat (1)
Language
- English (1197)
Document Type
- Conference Proceeding (1197)
Keywords
- Biosensor (25)
- CAD (7)
- Finite-Elemente-Methode (7)
- civil engineering (7)
- Bauingenieurwesen (6)
- Blitzschutz (6)
- Enterprise Architecture (5)
- Clusterion (4)
- Energy storage (4)
- Gamification (4)
Numerical models have become an essential part of snow avalanche engineering. Recent
advances in understanding the rheology of flowing snow and the mechanics of entrainment and
deposition have made numerical models more reliable. Coupled with field observations and historical
records, they are especially helpful in understanding avalanche flow in complex terrain. However, the
application of numerical models poses several new challenges to avalanche engineers. A detailed
understanding of the avalanche phenomenon is required to specify initial conditions (release zone
dimensions and snowcover entrainment rates) as well as the friction parameters, which are no longer
based on empirical back-calculations but on terrain roughness, vegetation and snow properties. In this
paper we discuss these problems by presenting the computer model RAMMS, which was specially
designed by the SLF as a practical tool for avalanche engineers. RAMMS solves the depth-averaged
equations governing avalanche flow with first and second-order numerical solution schemes. A
tremendous effort has been invested in the implementation of advanced input and output features.
Simulation results are therefore clearly and easily visualized to simplify their interpretation. More
importantly, RAMMS has been applied to a series of well-documented avalanches to gauge model
performance. In this paper we present the governing differential equations, highlight some of the input
and output features of RAMMS and then discuss the simulation of the Gatschiefer avalanche that
occurred in April 2008, near Klosters/Monbiel, Switzerland.
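For orientation, the depth-averaged equations mentioned above take, in their one-dimensional form and with the Voellmy-type friction relation used in RAMMS, roughly the following shape (a sketch only; notation varies between publications, with h the flow depth, u the depth-averaged velocity, and μ and ξ the dry-friction and turbulent-drag parameters):

```latex
\frac{\partial h}{\partial t} + \frac{\partial (h u)}{\partial x} = \dot{q}_e ,
\qquad
\frac{\partial (h u)}{\partial t}
+ \frac{\partial}{\partial x}\!\left( h u^{2} + \frac{g_z h^{2}}{2} \right)
= g_x h - \frac{S_f}{\rho} ,
\qquad
S_f = \mu \rho g_z h + \frac{\rho g u^{2}}{\xi}
```

Here q̇_e is the snowcover entrainment rate, and μ and ξ are exactly the friction parameters that, as the abstract notes, must now be chosen from terrain roughness, vegetation and snow properties rather than from back-calculation alone.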
The Volatility Framework is a collection of tools for the analysis of computer RAM. The framework offers a multitude of analysis options and is used by many investigators worldwide. Volatility currently comes with a command line interface only, which might be a hindrance for some investigators. In this paper we present a GUI and extensions for the Volatility Framework, which on the one hand simplify the usage of the tool and on the other hand offer additional functionality such as storage of results in a database, shortcuts for long Volatility Framework command sequences, and entirely new commands based on correlation of the data stored in the database.
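A minimal sketch of the database-storage idea, assuming the Volatility 2.x command line interface and SQLite as the result store (illustrative only, not the authors' GUI code):

```python
# Minimal sketch (not the authors' GUI code): wrap a Volatility 2.x
# command-line call and store its output in SQLite for later correlation.
import sqlite3
import subprocess

def run_and_store(image_path, profile, command, db_path="volatility_results.db"):
    # Invoke the Volatility CLI; adjust the executable name to your install.
    result = subprocess.run(
        ["volatility", "-f", image_path, "--profile", profile, command],
        capture_output=True, text=True, check=True,
    )
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS results "
        "(image TEXT, profile TEXT, command TEXT, output TEXT)"
    )
    con.execute(
        "INSERT INTO results VALUES (?, ?, ?, ?)",
        (image_path, profile, command, result.stdout),
    )
    con.commit()
    con.close()

# Example: run_and_store("memdump.raw", "Win7SP1x64", "pslist")
```

Once the raw outputs sit in one table, shortcuts and correlation commands of the kind described above reduce to SQL queries over it.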
An increasing number of applications target their executions on specific hardware like general purpose Graphics Processing Units. Some Cloud Computing providers offer this specific hardware so that organizations can rent such resources. However, outsourcing the whole application to the Cloud causes avoidable costs if only some parts of the application benefit from the specific expensive hardware. A partial execution of applications in the Cloud is a tradeoff between costs and efficiency. This paper addresses the demand for a consistent framework that allows for a mixture of on- and off-premise calculations by migrating only specific parts to a Cloud. It uses the concept of workflows to present how individual workflow tasks can be migrated to the Cloud whereas the remaining tasks are executed on-premise.
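A minimal sketch of the routing decision at the heart of such a mixed on-/off-premise execution (all names are hypothetical):

```python
# Illustrative sketch (names are hypothetical): decide per workflow task
# whether to run it on-premise or migrate it to a Cloud resource with
# special hardware such as general purpose GPUs.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    needs_gpu: bool

def route(task: Task) -> str:
    # Only tasks that actually benefit from the expensive hardware are
    # migrated; everything else stays on-premise to avoid Cloud costs.
    return "cloud-gpu-pool" if task.needs_gpu else "on-premise"

workflow = [Task("preprocess", False), Task("train", True), Task("report", False)]
for t in workflow:
    print(t.name, "->", route(t))
```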
With the final objective of optimizing the "Micromix" hydrogen combustion principle, a round jet in a laminar cross-flow prior to its combustion is investigated experimentally using Stereoscopic Particle Image Velocimetry. Measurements are performed at a jet to cross-stream momentum ratio of 1 and a Reynolds number, based on the jet diameter and jet velocity, of 1600. The suitability of combining side, top and end views is analyzed statistically. The statistical theory of testing hypotheses, pertaining to the joint distribution of the averaged velocity along intersecting observation planes, is employed. Overall, the averaged velocity fields of the varying observation planes feature homogeneity at a 0.05 significance level. Minor discrepancies are related to the given experimental conditions. By use of image maps and averaged and instantaneous velocity fields, an attempt is made to elucidate the flow physics, and a kinematically consistent vortex model is proposed. In the time-averaged flow field, the principal vortical systems were identified and the associated mixing visualized. The jet trajectory and physical dimensions scale with the momentum ratio times the jet diameter. The jet/cross-flow mixture converging upon the span-wise centre-line, the lifting action of the Counter Rotating Vortex Pair and the reversed flow region contribute to the high entrainment and mixedness. It is shown that the jet width is larger on the downstream side than on the upstream side of the centre-streamline. The deepest penetration of the particles on the outer boundary occurs in the centre-plane. Meanwhile, with increasing off-centre position, the boundaries all lie farther from the centre-line position than does the boundary in the centre-plane, corresponding to a kidney-like shape of the flow cross-section. The generation of the Counter Rotating Vortex Pair and the instability mechanism are documented by instantaneous image maps and vector fields. The necessary circulation for the Counter Rotating Vortex Pair originates from a combined effect of steady in-hole, hanging and wake vortices. The strong cross-flow and jet interaction induces a three-dimensional waving of the stream-wise Counter Rotating Vortex Pair, leading to the formation of Ring Like Vortices. A secondary Counter Rotating Vortex Pair forms on top of the primary Counter Rotating Vortex Pair, resulting in mixing by "puffs". Overall, Stereoscopic Particle Image Velocimetry proved capable of elucidating the complex Jet in Cross-Flow flow field. The insight gained into the mixing process will contribute to the optimization of the "Micromix" hydrogen combustion.
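The two governing parameters quoted above are defined in the usual way (a standard convention, stated here for orientation), with ρ the densities, u the velocities, d the jet diameter and ν the kinematic viscosity; subscripts j and c denote jet and cross-flow:

```latex
r = \frac{\rho_j u_j^{2}}{\rho_c u_c^{2}} = 1 ,
\qquad
Re = \frac{u_j d}{\nu} = 1600
```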
For more than a decade there has been ongoing interest in small gas turbines downsized to micro-scale. With their high energy density they offer great potential as a substitute for today's unwieldy accumulators found in a variety of applications such as laptops, small tools, etc. Micro-scale gas turbines could not only be used for generating electricity; they could also produce thrust for powering small unmanned aerial vehicles (UAVs) or similar devices. Besides the great design challenges posed by the rotating parts of the turbomachinery at this small scale, another crucial item is the combustion chamber needed for safe and reliable operation. With the so-called regular micromix burning principle for hydrogen successfully downscaled in an initial combustion chamber prototype of 10 kW energy output, this paper describes a new design attempt aimed at the integration possibilities in a μ-scale gas turbine. To manufacture the combustion chamber completely out of stainless steel components, a recuperative wall cooling was introduced to keep the temperatures in an acceptable range. A new way of integrating the ignition was also developed. The detailed description of the prototype's design is followed by an in-depth report on the test results. The experimental investigations comprise a set of mass flow variations, coupled with a variation of the equivalence ratio for each mass flow at different inlet temperatures and pressures. With the data obtained by an exhaust gas analysis, a full characterisation of the prototype chamber concerning combustion efficiency and stability is possible. Furthermore, the data show full compliance with the expected operating requirements of the designated μ-scale gas turbine.
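The equivalence ratio varied in these tests has its standard definition, with stoichiometric combustion at φ = 1:

```latex
\phi = \frac{\dot{m}_{\mathrm{fuel}} / \dot{m}_{\mathrm{air}}}
{\left( \dot{m}_{\mathrm{fuel}} / \dot{m}_{\mathrm{air}} \right)_{\mathrm{stoich}}}
```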
Among many approaches to address the high-level decision making problem for autonomous robots and agents, the robot programming and plan language Golog follows a logic-based deliberative approach, and its successors were successfully deployed in a number of robotics applications over the past ten years. Usually, Golog interpreters are implemented in Prolog, which is not available for our target platform, the biped robot platform Nao. In this paper we sketch our first approach towards a prototype implementation of a Golog interpreter in the scripting language Lua. With the example of the elevator domain we discuss how the basic action theory is specified and how we implemented fluent regression in Lua. One possible advantage of the availability of a non-Prolog implementation of Golog is that Golog becomes available on a larger number of platforms, and also becomes more attractive for roboticists outside the Cognitive Robotics community.
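To illustrate what fluent regression in the elevator domain amounts to, here is a minimal sketch; the paper's prototype is written in Lua, whereas Python is used here only to keep the examples in this collection in one language, and the function names are invented:

```python
# Illustrative sketch of fluent regression in the classic elevator domain:
# the value of a fluent after an action is expressed via its value before
# the action, in the style of successor state axioms.
def regress_current_floor(action, args, value_before):
    """currentFloor after an action: moving sets the floor explicitly,
    any other action leaves it unchanged."""
    if action in ("up", "down"):
        return args[0]
    return value_before

def regress_light_on(floor, action, args, value_before):
    """on(floor) is false after turnoff(floor), otherwise unchanged."""
    if action == "turnoff" and args[0] == floor:
        return False
    return value_before

# Regressing through the action sequence [up(3), turnoff(3)]:
floor = regress_current_floor("up", (3,), value_before=0)            # -> 3
light = regress_light_on(3, "turnoff", (3,), value_before=True)      # -> False
print(floor, light)
```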
In order to allow an autonomous robot to perform non-trivial tasks, such as exploring a foreign planet, the robot has to have deliberative capabilities like reasoning or planning. Logic-based approaches like the programming and planning language Golog and its successors have been successfully used for such decision-making problems. A drawback of this particular programming language is that its interpreters are usually written in Prolog and run on a Prolog back-end. Such back-ends are usually not available or feasible on resource-limited robot systems. In this paper we present our ideas and first results of a re-implementation of the interpreter based on the Lua scripting language, which is available on a wide range of systems including small embedded systems.
We propose a formalism for reasoning about actions based on multi-modal logic which allows for expressing observations as first-class objects. We introduce a new modal operator, namely [o |α], which allows us to capture the notion of perceiving an observation given that an action has taken place. Formulae of the type [o |α]ϕ mean 'after perceiving observation o, given α was performed, necessarily ϕ'. In this paper, we focus on the challenges concerning sensing with explicit observations, and acting with nondeterministic effects. We present the syntax and semantics, and a correct and decidable tableau calculus for the logic.
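A hedged sketch of the intended reading in Kripke-style notation (the paper's actual semantics may differ in detail), writing w →(α,o) w' for "w' is a possible successor of w after performing α and perceiving o":

```latex
w \models [o \,|\, \alpha]\,\varphi
\quad \Longleftrightarrow \quad
\forall w' :\; w \xrightarrow{\;\alpha,\, o\;} w' \;\Rightarrow\; w' \models \varphi
```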
The idea of component-based software engineering was proposed more than 40 years ago, yet only few robotics software frameworks follow these ideas. The main problem with robotics software usually is that it runs on a particular platform and transferring source code to another platform is difficult. In this paper, we present our software framework Fawkes, which follows the component-based software design paradigm by featuring a clear component concept with well-defined communication interfaces. We deployed Fawkes on several different robot platforms, ranging from service robots to biped soccer robots. Following the component concept with clearly defined communication interfaces shows great benefit when porting robot software from one robot to the other. Fawkes comes with a number of useful plugins for tasks like timing, logging, data visualization, software configuration, and even high-level decision making. These make it particularly easy to create and to debug productive code, shortening the typical development cycle for robot software.
The high-level decision making process of an autonomous robot can be seen as a hierarchically organised entity, where strategic decisions are made on the topmost layer, while the bottom layer serves as a driver for the hardware. In between is a layer with monitoring and reporting functionality. In this paper we propose a behaviour engine for this middle layer which, based on the formalism of hybrid state machines (HSMs), bridges the gap between high-level strategic decision making and low-level actuator control. The behaviour engine has to execute and monitor behaviours and report status information back to the higher level. To be able to call the behaviours or skills hierarchically, we extend the model of HSMs with dependencies and sub-skills. These Skill-HSMs are implemented in the lightweight but expressive Lua scripting language, which is well-suited to implement the behaviour engine on our target platform, the humanoid robot Nao.
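A minimal sketch of the skill idea, with hypothetical class and method names (the actual behaviour engine is written in Lua and is considerably richer):

```python
# Minimal sketch of a skill as a state machine with sub-skills; the names
# are hypothetical, not the behaviour engine's actual API.
class SkillHSM:
    def __init__(self, name, transitions, subskills=None):
        self.name = name
        self.state = "START"
        self.transitions = transitions      # state -> [(condition, next state)]
        self.subskills = subskills or {}    # state -> sub-skill executed there

    def step(self, world):
        # Execute the sub-skill attached to the current state, if any,
        # then take the first transition whose condition holds.
        if self.state in self.subskills:
            self.subskills[self.state].step(world)
        for cond, nxt in self.transitions.get(self.state, []):
            if cond(world):
                self.state = nxt
                break
        return self.state                   # reported back to the higher level

# A trivial "goto" skill: run until the target is reached, then finish.
goto = SkillHSM("goto", {
    "START": [(lambda w: True, "MOVING")],
    "MOVING": [(lambda w: w["at_target"], "FINAL")],
})
print(goto.step({"at_target": False}), goto.step({"at_target": True}))
```

The state reported by each step is exactly the status information the strategic layer monitors.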
The Solar-Institute Jülich (SIJ) has initiated the construction of the first and only German solar tower power plant and is now involved in the accompanying research. The power plant for experimental and demonstration purposes in the town of Jülich started supplying electric energy at the beginning of 2008. As its central innovation, the central receiver plant features an open volumetric receiver consisting of porous ceramic elements that simultaneously absorb the concentrated sunlight and transfer the heat to ambient air passing through the pores, so that an average temperature of 680°C is reached. The subsequent steam cycle generates up to 1.5 MWe. A main field of research at the SIJ is the optimization of the absorber structures. To analyze the capability of new absorber specimens, a special test facility was developed and set up in the laboratory. A high-performance near-infrared radiator offers a variable and repeatable beam for single test samples with a power of up to 320 kW/m² peak. The temperatures achieved on the absorber surface can reach more than 1000°C. To suck ambient air through the open absorber, as on the tower, it is mounted on a special blower system. An overview of the test facility and some recent results are presented.
The importance of validating and reproducing the outcome of computational processes is fundamental to many application domains. Assuring the provenance of workflows will likely become even more important with respect to the incorporation of human tasks into standard workflows by emerging standards such as WS-HumanTask. This paper addresses this trend by an actor-based workflow approach that actively supports provenance. It proposes a framework to track and store provenance information automatically that applies to various workflow management systems. In particular, the introduced provenance framework supports the documentation of workflows in a legally binding way. The authors therefore use the concept of layered XML documents, i.e. history-tracing XML. Furthermore, the proposed provenance framework enables the executors (actors) of a particular workflow task to attest their operations and the associated results by integrating digital XML signatures.
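A minimal sketch of the layering idea (element names are hypothetical, and a real history-tracing XML system uses XML signatures rather than a bare hash): each new layer embeds the previous document together with a digest of it, so later steps attest to earlier ones.

```python
# Minimal sketch of history-tracing: wrap the previous XML layer, plus a
# digest of it, in a new layer recording who did what. Element names are
# hypothetical; a real system would attach XMLDSig signatures instead of
# a bare SHA-256 hash.
import hashlib
import xml.etree.ElementTree as ET

def add_layer(prev_xml: str, actor: str, operation: str) -> str:
    layer = ET.Element("layer", actor=actor, operation=operation,
                       prev_digest=hashlib.sha256(prev_xml.encode()).hexdigest())
    layer.append(ET.fromstring(prev_xml))   # embed the full previous layer
    return ET.tostring(layer, encoding="unicode")

doc = "<task id='review'/>"
doc = add_layer(doc, actor="alice", operation="executed")
doc = add_layer(doc, actor="bob", operation="approved")
print(doc)
```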
Lightning safety guidelines
(2010)
This paper introduces lightning to the layman, describing the right behaviour during thunderstorms as well as protective measures against lightning. It also contributes to the prevention of lightning injuries and damage. This report was prepared by the authors within the AHG1 Group for IEC TC81 (Lightning Protection).
Large power plants can be endangered by lightning strikes, with possible consequences for their safety and availability. A special scenario is a lightning strike to the HV overhead transmission line close to the power plant's connection to the power grid. If, additionally, a so-called shielding failure of the overhead ground wire on top of the overhead transmission line is assumed, i.e. the lightning strikes directly into a phase conductor, this constitutes an extreme electromagnetic disturbance. The paper deals with the numerical simulation of such a lightning strike and the consequences for the components of the power plant's auxiliary power network connected to different voltage levels.
When planning the air-terminations for a structure to be protected, the rolling-sphere method (electro-geometrical model) is, from the standpoint of lightning physics, the best approach. International standards therefore prefer this method. However, using the rolling-sphere method only yields the possible points of strike on a structure, without giving information about the probability of strikes at the individual points compared to others.
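The rolling-sphere radius follows from the electro-geometrical relation between striking distance and prospective peak current; IEC 62305-1 uses r = 10·I^0.65 (r in metres, I in kA). A small sketch:

```python
# Rolling-sphere radius from the electro-geometrical model, r = 10 * I**0.65
# (r in metres, peak current I in kA), as used in IEC 62305-1. A point on
# the structure is a possible point of strike if the rolling sphere can
# touch it.
def rolling_sphere_radius_m(peak_current_kA: float) -> float:
    return 10.0 * peak_current_kA ** 0.65

# Minimum peak currents of the four IEC 62305 lightning protection levels:
for i_kA in (3, 5, 10, 16):
    print(f"I = {i_kA:>2} kA  ->  r = {rolling_sphere_radius_m(i_kA):5.1f} m")
```

This reproduces the familiar standard radii of roughly 20, 30, 45 and 60 m, but, as the abstract points out, says nothing about how probable a strike to each touchable point is.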
Currently, most workflow management systems in Grid environments provide push-oriented job distribution strategies, where jobs are explicitly delegated to resources. In those scenarios the dedicated resources execute submitted jobs according to the request of a workflow engine or Grid-wide scheduler. This approach has various limitations, particularly if human interactions should be integrated into workflow execution. To support human interactions with the benefit of enabling inter-organizational computation and community approaches, this poster paper proposes the idea of a pull-based task distribution strategy. Here, heterogeneous resources, including human interaction, actively select tasks for execution from a central repository. This leads to special demands regarding security issues like access control. In the established push-based job execution, the resources are responsible for granting access to workflows and job initiators. In general this is done by access control lists, where users are explicitly mapped to local accounts according to their policies. In the pull-based approach the resources actively apply for job executions by sending requests to a central task repository. This means that every resource has to be able to authenticate against the repository to be authorized for task execution. In other words, the authorization is relocated from the resources to the repository. The poster paper introduces current work regarding the mentioned security aspects of the pull-based approach within the scope of the project "HiX4AGWS".
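A minimal sketch of the pull principle with the authorization relocated to the repository (all names are hypothetical):

```python
# Minimal sketch of pull-based task distribution: the central repository,
# not the individual resource, decides who may execute tasks. All names
# are hypothetical.
class TaskRepository:
    def __init__(self):
        self.tasks = ["render-frame-1", "render-frame-2"]
        self.authorized = {"token-gpu-node-7"}   # credentials allowed to pull

    def pull(self, token: str):
        # Authentication/authorization happens here, at the repository.
        if token not in self.authorized:
            raise PermissionError("resource not authorized for task execution")
        return self.tasks.pop(0) if self.tasks else None

repo = TaskRepository()
print(repo.pull("token-gpu-node-7"))   # a resource actively requests work
```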
In steps of the production chain of carbide inserts, such as unloading or packaging, the conformity test of the insert type is done manually, which causes a statistical increase in errors due to the monotony and fatigue of the worker and the wide variety of insert types. A machine vision system is introduced that captures digital frames of the inserts in the production line, inspects them automatically and measures four quality features: coating colour, edge radius, plate shape and chip-former geometry. This new method has been tested on several inserts of different types and has shown that the prevalent insert types can be inspected and robustly classified in a real production environment, thereby improving manufacturing automation.
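A hedged sketch of one of the four checks, the plate shape, using OpenCV; thresholds and file names are placeholders, not the production system's values:

```python
# Sketch of a plate-shape check with OpenCV: segment the insert, then
# classify the outline by its number of corners. The file name and the
# approximation tolerance are placeholders.
import cv2

img = cv2.imread("insert.png", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
# OpenCV 4.x signature: findContours returns (contours, hierarchy).
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

outline = max(contours, key=cv2.contourArea)          # the insert itself
poly = cv2.approxPolyDP(outline, 0.02 * cv2.arcLength(outline, True), True)
shape = {3: "triangular", 4: "square/rhombic"}.get(len(poly), "round/other")
print("detected plate shape:", shape)
```

The remaining features (coating colour, edge radius, chip-former geometry) would be measured on the same segmented region with colour statistics and contour curvature.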
Shakedown analysis of two dimensional structures by an edge-based smoothed finite element method
(2010)
Production and distribution of personalized information services employing mass customization
(2003)
Recently, the SHARP Corporation, Japan, has developed the world's first "Plasma Cluster Ions (PCI)" air purification technology, using plasma discharge to generate cluster ions. The new plasma cluster device releases positive and negative ions into the air, which are able to decompose and deactivate harmful airborne substances by chemical reactions. Because cluster ions consist of positive and negative ions that normally exist in the natural world, they are completely harmless and safe to humans. The amount of ozone generated by cluster ions is less than 0.01 ppm, which is significantly less than the 0.05-ppm standard for industrial operations and consumer electronics; this amount thus has no harmful effects whatsoever on the human body. However, the particular properties and chemical processes in PCI treatment are still under study. It has been shown that PCI in most cases has strongly pronounced irreversible killing effects on airborne microflora due to free-radical-induced reactions and can be considered a potent technology to disinfect home, medical and industrial appliances.
Summary and Conclusions: PCIs were clearly effective in terms of their antibacterial effects with the strains tested. This efficacy increased with the time the bacteria were exposed to PCIs. The bactericidal action proved to be irreversible. PCIs were significantly less effective in shadowed areas. PCI exposure caused multiple protein damages, as observed in SDS-PAGE studies. Bacterial death was caused not by a single molecular mechanism but by multiple ones.
Recently, the SHARP corporation has developed the world's first "Plasma Cluster Ions (PCI)" air purification technology, which uses plasma discharge to generate cluster ions. The new plasma cluster device releases positive and negative ions into the air, which are harmless to humans and are able to decompose and deactivate airborne substances by chemical reactions. Many phenomenological tests of the PCI air purification technology on microbial cells have been conducted, and in most cases it has been shown that PCI demonstrates a strongly pronounced killing effect. However, the particular mechanisms of PCI action are still not evident. We studied variations in resistance to PCI among gram-positive airborne microorganisms, as well as some dose-dependent, spatial, cultural and biochemical properties of PCI action with respect to Staphylococcus spp, Enterococcus spp, Micrococcus spp.
Recently, SHARP corporation has developed the world’s first "Plasma Cluster Ions® (PCI)" air purification technology, which uses plasma discharge to generate cluster ions. The new Plasma Cluster Device releases positive and negative ions into the air, which are harmless to humans and are able to decompose and deactivate airborne substances by chemical reactions. In the past, phenomenological tests on the efficacy of the PCI air purification technology on microbial cells have been conducted. In most cases, it has been shown that PCI demonstrated strongly pronounced killing effects on microorganisms. However, the particular mechanisms of PCI action still have to be uncovered.
Mechanical stimulation of the cells resulted in evident changes in cell morphology, protein composition and gene expression. Microscopically, additional formation of stress fibers accompanied by cell re-arrangements in a monolayer was observed. Also, significant activation of the p53 gene was revealed compared to the control. Interestingly, the use of the CellTech membrane coating induced cell death after mechanical stress had been applied. Such an effect was not detected when fibronectin had been used as an adhesion substrate.
The sorption of LPS toxic shock by nanoparticles based on carbonized vegetable raw materials
(2008)
Immobilization of lactobacilli on vegetable raw materials (rice husk, grape stones) carbonized at high temperature increases their physiological activity and the quantity of antibacterial metabolites, which consequently increases the antagonistic activity of the lactobacilli. This implies that the use of nanosorbents for the attachment of probiotic microorganisms is highly promising for solving important problems such as the targeted delivery of probiotic preparations and their attachment to the intestinal mucosa, with subsequent detoxication of the gastro-intestinal tract and normalization of its microecology. In addition, the carbonized nanoparticles obtained have a peculiar property: the ability to sorb the LPS of toxic shock and, hence, to detoxify LPS.
A melting probe equipped with an autofluorescence-based detection system combined with a light scattering unit and, optionally, with a microarray chip would be ideally suited to probe icy environments like Europa's ice layer as well as the polar ice layers of Earth and Mars for recent and extinct life.
From these results we can conclude that proteins, mainly in vitro, denature completely at temperatures between 57°C and 62°C, and that they are also affected by NO and by different types of ions. In particular, NO causes earlier protein denaturation, which means that NO has a destabilizing effect on proteins; different ions also alter protein denaturation, with some ions causing earlier denaturation while others do not.
The absence of a general method for endotoxin removal from liquid interfaces provides an opportunity to find new methods and materials to close this gap. Activated nanostructured carbon is a promising material that has shown good adsorption properties due to its vast pore network and high surface area. The aim of this study is to find the adsorption rates for a carbonaceous material produced at different temperatures, as well as to reveal possible differences in the performance of the material for each of the adsorbates used during the study (hemoglobin, serum albumin and lipopolysaccharide, LPS).
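Adsorption rates of this kind are commonly quantified with pseudo-first- or pseudo-second-order kinetic models (a standard choice, not necessarily the one used in this study), where q_t is the amount adsorbed at time t, q_e the equilibrium loading and k_1, k_2 the rate constants:

```latex
\frac{dq_t}{dt} = k_1 \left( q_e - q_t \right)
\qquad \text{or} \qquad
\frac{dq_t}{dt} = k_2 \left( q_e - q_t \right)^{2}
```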
One of the interesting but not well-known properties of water is the appearance of highly ordered structures in response to a strong electric field. In 1893 Sir William Armstrong placed a cotton thread between two wine glasses filled with chemically pure water. When a high DC voltage was applied between the glasses, a connection consisting of water formed, producing a "water bridge".
We present the novel concept of a combined drilling and melting probe for subsurface ice research. This probe, named "IceMole", is currently being developed, built, and tested at the FH Aachen University of Applied Sciences' Astronautical Laboratory. Here, we describe its first prototype design and report the results of its field tests on the Swiss Morteratsch glacier. Although the IceMole design is currently adapted to terrestrial glaciers and ice shields, it may later be modified for the subsurface in-situ investigation of extraterrestrial ice, e.g., on Mars, Europa, and Enceladus. If life exists on those bodies, it may be present in the ice (as life can also be found in the deep ice of Earth).
Tests with palm tree leaves have only just started, and the scan data are in the process of being analyzed. The final goal of the future project on palm tree gender and species recognition is to develop optical scanning technology to be applied to date palm tree leaves for in-situ screening purposes. Depending on the software used and the particular requirements of the users, the technology shall potentially be able to identify palm tree diseases, palm tree gender, and the species of young date palm trees by scanning leaves.
The "IceMole" is a novel maneuverable subsurface ice probe for clean in-situ analysis and sampling of subsurface ice and subglacial water/brine. It is developed and built at FH Aachen University of Applied Sciences' Astronautical Laboratory. A first prototype was successfully tested on the Swiss Morteratsch glacier in 2010. Clean sampling is achieved with a hollow ice screw (as used in mountaineering) at the tip of the probe. Maneuverability is achieved with a differentially heated melting head. Funded by the German Space Agency (DLR), a consortium led by FH Aachen is currently developing a much more advanced IceMole probe, which includes a sophisticated system for obstacle avoidance, target detection, and navigation in the ice. We intend to use this probe for taking clean samples of subglacial brine at the Blood Falls (McMurdo Dry Valleys, East Antarctica) for chemical and microbiological analysis. In our conference contribution, we 1) describe the IceMole design, 2) report the results of the field tests of the first prototype on the Morteratsch glacier, 3) discuss the probe's potential for the clean in-situ analysis and sampling of subsurface ice and subglacial liquids, and 4) outline the way ahead in the development of this technology.
Proceedings of the 2nd Humboldt Kolleg, Hammamet, Tunisia. Organizer: Alexander von Humboldt Stiftung, Germany. pdf, 184 p. Welcome Address: Dear Participants, Welcome to the 2nd Humboldt Kolleg on "Nanoscale Science and Technology" (NS&T'12) in Tunisia, sponsored by the "Alexander von Humboldt" foundation. The NS&T'12 multidisciplinary scientific program includes seven "hot" topics dealing with "Nanoscale Science and Technology", covering basic and application-oriented research as well as industrial (market) aspects: - Molecular Biophysics, Spectroscopy Techniques, Imaging Microscopy - Nanomaterials Synthesis for Medicine and Bio-chemical Sensors - Nanostructures, Semiconductors, Photonics and Nanodevices - New Technologies in Market Industry - Environment, Electro-chemistry, Bio-polymers and Fuel Cells - Nanomaterials, Photovoltaic, Modelling, Quantum Physics - Microelectronics, Sensors Networks and Embedded Systems We are deeply indebted to all members of the Scientific Committee and General Chairs for the joint sessions, and to all speakers and chairmen, who have dedicated invaluable time and effort to the realization of this event. On behalf of the Organizing Committee, we cordially invite you to join the conference and hope that your stay will be fruitful, rewarding and enjoyable. Prof. Dr. Michael J. Schöning, Prof. Dr. Adnane Abdelghani
Summary: This paper presents a methodology to study and understand the mechanics of stapled anastomotic behaviors by combining empirical experimentation and finite element analysis. Performance of the stapled anastomosis is studied in terms of leakage, and numerical results are compared to in vitro experiments performed on fresh porcine tissue. Results suggest that leaks occur between the tissue and the staple legs penetrating through the tissue.
In: Proc. of the 11th Intl. Conf. on Computing in Civil and Building Engineering (ICCCBE-XI), ed. Hugues Rivard, Montreal, Canada, pp. 1-12, ASCE (CD-ROM), 2006. Currently, the conceptual design phase is not adequately supported by any CAD tool. Neither support while elaborating conceptual sketches nor an automatic proof of correctness with respect to effective restrictions is currently provided by any commercial tool. To enable domain experts to store the common as well as their personal domain knowledge, we develop a visual language for knowledge formalization. In this paper, a major extension to the already existing concepts is introduced. The possibility to define rule dependencies extends the expressiveness of the knowledge definition language and contributes to the usability of our approach.
Proc. of the 2005 ASCE Intl. Conf. on Computing in Civil Engineering (ICCC 2005), eds. L. Soibelman and F. Pena-Mora, pp. 1-14, ASCE (CD-ROM), Cancun, Mexico, 2005. Current CAD tools are not able to support the fundamental conceptual design phase, and none of them provides consistency analyses of sketches produced by architects. To give architects greater support in the conceptual design phase, we develop a CAD tool for conceptual design and a knowledge specification tool allowing the definition of conceptually relevant knowledge. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic knowledge model, different types of design rules formalize the knowledge in a graph-based realization. An expressive visual language provides a user-friendly, human-readable representation. Finally, consistency analyses enable conceptual designs to be checked against this defined knowledge. In this paper we concentrate on the knowledge specification part of our project.
In: Computer Aided Architectural Design Futures 2005, Part 4, 207-216, DOI: http://dx.doi.org/10.1007/1-4020-3698-1_19 The conceptual design at the beginning of the building construction process is essential for the success of a building project. Even if some CAD tools allow elaborating conceptual sketches, they rather focus on the shape of the building elements and not on their functionality. We introduce semantic roomobjects and roomlinks, by way of example in the CAD tool ArchiCAD. These extensions provide a basis for specifying the organisation and functionality of a building and free architects from being forced to directly produce detailed constructive sketches. Furthermore, we introduce consistency analyses of the conceptual sketch, based on an ontology containing conceptually relevant knowledge specific to one class of buildings.
IASSE-2004 - 13th International Conference on Intelligent and Adaptive Systems and Software Engineering, eds. W. Dosch, N. Debnath, pp. 245-250, ISCA, Cary, NC, 1-3 July 2004, Nice, France. We introduce a UML-based model for conceptual design support in civil engineering. To this end, we identify required extensions to standard UML. Class diagrams are used for elaborating building-type-specific knowledge; object diagrams, implicitly contained in the architect's sketch, are validated against the defined knowledge. To enable the use of industrial, domain-specific tools, we provide an integrated conceptual design extension. The developed tool support is based on graph rewriting. With our approach architects are enabled to deal with semantic objects during the early design phase, assisted by incremental consistency checks.
In: Net-distributed Co-operation: Xth International Conference on Computing in Civil and Building Engineering, Weimar, June 02-04, 2004; proceedings / [ed. by Karl Beuke ...]. Weimar: Bauhaus-Univ. Weimar 2004, 1st ed., pp. 1-14, ISBN 3-86068-213-X. Summary: In our project, we develop new tools for the conceptual design phase. During conceptual design, the coarse functionality and organization of a building is more important than a detailed, worked-out construction. We identify two roles: first, the knowledge engineer, who is responsible for knowledge definition and maintenance; second, the architect, who elaborates the conceptual design. The tool for the knowledge engineer is based on graph technology; it is specified using PROGRES and the UPGRADE framework. The tools for the architect are integrated into the industrial CAD tool ArchiCAD. Consistency between knowledge and conceptual design is ensured by the constraint checker, another extension to ArchiCAD.
ITCE-2003 - 4th Joint Symposium on Information Technology in Civil Engineering, ed. Flood, I., pp. 1-12, ASCE (CD-ROM), Nashville, USA. In this paper we discuss graph-based tools to support architects during the conceptual design phase. Conceptual design takes place before constructive design; the concepts used are more abstract. We develop two graph-based approaches: a top-down approach using the graph rewriting system PROGRES, and a more industrially oriented approach in which we extend the CAD system ArchiCAD. In both approaches, knowledge can be defined by a knowledge engineer: in the top-down approach in the domain model graph, in the bottom-up approach in an XML file. The defined knowledge is used to incrementally check the sketch and to inform the architect about violations of the defined knowledge. Our goal is to discover design errors as soon as possible and to support the architect in designing buildings with consideration of conceptual knowledge.
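A hedged sketch of the bottom-up idea (the rule format and all names are invented for illustration, not the project's actual XML schema): conceptually relevant knowledge lives in an XML file, and the sketch is checked incrementally against it.

```python
# Illustrative sketch only: check a conceptual sketch (rooms and links)
# against design rules read from an XML knowledge file. The rule format
# is invented for illustration, not the project's actual schema.
import xml.etree.ElementTree as ET

KNOWLEDGE = """<rules>
  <requires source="kitchen" target="dining" link="adjacent"/>
</rules>"""

sketch_links = {("kitchen", "hall", "adjacent")}  # links drawn by the architect

for rule in ET.fromstring(KNOWLEDGE):
    triple = (rule.get("source"), rule.get("target"), rule.get("link"))
    if triple not in sketch_links:
        print("rule violated:", triple)           # reported to the architect
```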
Applications of Graph Transformations with Industrial Relevance, Lecture Notes in Computer Science, 2004, Volume 3062/2004, 434-439, DOI: http://dx.doi.org/10.1007/978-3-540-25959-6_33 This paper gives a brief overview of the tools we have developed to support conceptual design in civil engineering. Based on the UPGRADE framework, two applications, one for the knowledge engineer and another for architects, allow domain-specific knowledge to be stored and used during conceptual design. Consistency analyses check the design against the defined knowledge and inform the architect if rules are violated.
In: Advances in intelligent computing in engineering: proceedings of the 9th International EG-ICE Workshop, Darmstadt, (01-03 August) 2002 / Martina Schnellenbach-Held ... (eds.). Düsseldorf: VDI-Verl., 2002. Fortschritt-Berichte VDI, Reihe 4, Bauingenieurwesen; 180; pp. 1-35. The paper describes a novel way to support conceptual design in civil engineering. The designer uses semantic tools guaranteeing certain internal structures of the design result as well as the fulfillment of various constraints. Two different approaches and corresponding tools are discussed: (a) visually specified tools with automatic code generation to determine a design structure as well as to fix various constraints a design has to obey; these tools are also valuable for design knowledge specialists. (b) Extensions of existing CAD tools to provide semantic knowledge to be used by an architect. It is sketched how these different tools can be combined in the future. The main part of the paper discusses the concepts and realization of two prototypes following the two approaches above. The paper especially argues that specific graphs and the specification of their structure are useful for both tool realization projects.
Working paper distributed at the 2nd Annual Next Generation Telecommunications Conference 2009, 13th-14th October 2009, Brussels, 14 pages. Abstract: Governments all over Europe are in the process of adopting new broadband strategies. The objective is to create modern telecommunications networks based on powerful broadband infrastructures. In doing so, they aim for innovative and investment-friendly concepts. For instance, in a recently published consultation paper on the subject the German regulator BNetzA declared that it will take "greater account of ... reducing risks, securing the investment and innovation power, providing planning certainty and transparency - in order to support and advance broadband rollout in Germany". It further states that when regulating wholesale rates it has to be ensured that "... adequate incentives for network rollout are provided on the one hand, while sustainable and fair competition is ensured on the other". An EC draft recommendation on regulated network access is also about to set new standards for the regulation of next generation access networks. According to the recommendation, the prices of new assets shall be based on costs plus a project-specific risk premium to be included in the cost of capital for the investment risk incurred by the operator. This approach has been criticised from various sides. In particular it has been questioned whether such an approach is adequate to meet the objectives of encouraging both competition and investment in next generation access networks. Against this background, the concept of "long term risk sharing contracts" has been proposed recently as an approach which not only incorporates the various additional risks involved in the deployment of NGA infrastructure, but has several other advantages. This paper will demonstrate that the concept allows competition to evolve at both the retail and wholesale level on fair, objective, non-discriminatory and transparent terms and conditions. Moreover, it ensures the highest possible investment incentive in line with the socially desirable outcome. The paper is organised as follows: the next section briefly outlines the importance of encouraging competition and investment in an NGA environment. The third section specifies the design of long term risk sharing contracts in view of achieving these objectives. The fourth section examines potential problems associated with the concept and elaborates a way of dealing with them. The last section looks at arguments against long term risk sharing contracts. It will be shown that these arguments are not strong enough to build a case against introducing such contracts.
The ANM'09 multi-disciplinary scientific program includes topics in the fields of "Nanotechnology and Microelectronics", ranging from "Bio/Micro/Nano Materials and Interfacing" aspects, "Chemical and Bio-Sensors", "Magnetic and Superconducting Devices", and "MEMS and Microfluidics" through "Theoretical Aspects, Methods and Modelling" to the important bridging session "Academics meet Industry".
The propagation of mechanical waves in plates of isotropic elastic material is investigated. After a short introduction to the focussing of stress waves in a plate with a curved boundary, the method of characteristics is applied to a plate of hyperelastic material. Using this method, the propagation of acceleration waves is discussed. Based on this, a numerical difference scheme is developed for solving initial-boundary-value problems and applied to two examples: propagation of a point disturbance in a homogeneously finitely strained non-linear elastic plate, and geometrical focussing in a linear elastic plate.
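For the linear elastic limit case, the relevant plate wave speeds, which also govern the characteristics of such a difference scheme, are standard results (stated here for orientation; E Young's modulus, ν Poisson's ratio, ρ density):

```latex
c_1 = \sqrt{\frac{E}{\rho \left( 1 - \nu^{2} \right)}}
\;\; \text{(plate longitudinal wave)},
\qquad
c_2 = \sqrt{\frac{E}{2 \rho \left( 1 + \nu \right)}}
\;\; \text{(shear wave)}
```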
In the presented paper, field data related to damage statistics of electrical and electronic apparatus in households are reported and investigated. These damages (approx. 74000 cases in total), registered by five German insurance companies in 2005 and 2006, were reported by customers as caused by lightning overvoltages. With the use of stochastic methods it is possible to reassess the collected data and to distinguish between cases which are with high probability caused by lightning overvoltages and those which are not. If there was an indication of a direct strike, the case was excluded, so the focus was only on indirect lightning flashes, i.e. only flashes to ground near the structure and flashes to or near an incoming service line were investigated. The field data contain the location of the damaged apparatus (residence of the policy holder) and the distance of the nearest cloud-to-ground stroke to the location of the damage, registered by the German lightning location network BLIDS at the date of damage. The statistical data, along with some complementary numerical simulations, allow the correspondence of the rules of the standard IEC 62305-2 with the field data to be verified and some correction needs to be defined. The results could lead to a better understanding of whether a damage reported to an insurance company is really caused by indirect lightning or not.
On 1 January 1998, the German telecom market was fully liberalised. Since then, genuine competition between market participants has developed, based on a comprehensive legal and regulatory framework that provides safeguards against unfair competition and market power by Deutsche Telekom. Today, about 10 years after the liberalisation of the telecommunications sector, a revision of this regulatory approach has become necessary because in at least three dimensions the situation is quite different from the one 10 years ago: first, with numerous established alternative operators in the market, monopolies have been successfully challenged and competition introduced. Second, not only is Cable TV becoming a viable alternative for the provision of broadband services in large parts of Germany, but mobile services are also increasingly becoming a substitute for fixed services. Last but not least, important technological changes are under way, requiring huge investments in infrastructure upgrades for next generation networks. In the light of these new developments the question is to what extent the current regulatory approach of severe ex-ante regulatory intervention is still appropriate. Is any part of the network of the former incumbent still a bottleneck? A more light-handed regulatory approach might be the right response to this new situation. The paper is organised as follows: the first section briefly examines the economic rationale for regulating network access. Based on the assumption that regulation is always necessary when bottlenecks exist, regulatory principles for an efficient network access regime are derived. The second section compares the situation of the German market in early 1998 with that of today, considering three dimensions: the degree of competition, the potential for substitution, and technological developments. The third section defines some requirements for the future regulation of telecom markets, and proposals are elaborated on how to ensure competitive telecom markets in the light of new economic and technological challenges.
RaWid was the German national technology programme on transonic aerodynamics and supporting technologies, lasting from 1995 to 1998. One of the main topics was laminar wing development. Besides aerodynamic design work, many operational aspects were investigated. A manufacturing concept was developed to be applied to operational laminar wings and empennages. It was built as a large-scale manufacturing demonstrator with the aerodynamic shape of a 1.5 m section of the A320 fin nose. Tolerances in shape and roughness fulfilled all requirements. The construction can easily be adapted to varying stiffness and strength requirements. Weight and manufacturing costs are comparable to common nose designs. The mock-up to be designed in ALTTA is based on this manufacturing principle. Another critical point is contamination of suction surfaces. Several tests were performed to investigate perforated titanium suction surfaces at realistic operational conditions: - a one-year flight test with a suction plate in the stagnation area of the Airbus "Beluga" - a one-year test of several suction plates in a ground test near the airport - a one-year test of a working suction ground test installation in all weather conditions. No critical results were found; no long-term degradation of the suction is visible. Icing conditions and ground de-icing fluids used at airports did not pose severe problems. Some problems detected only require weak design constraints to be respected.
7th International Conference on Reliability of Materials and Structures (RELMAS 2008), June 17-20, 2008, Saint Petersburg, Russia, pp. 354-358. Reprint with corrections in red. Introduction: The analysis of advanced structures working under extremely heavy loading, such as nuclear power plants and piping systems, should take into account the randomness of loading, geometrical and material parameters. The existing reliability methods are restricted mostly to the elastic working regime, e.g. allowable local stresses. The development of limit and shakedown reliability-based analysis and design methods, exploiting the potential of the shakedown working regime, is highly needed. In this paper the application of a new algorithm of probabilistic limit and shakedown analysis for shell structures is presented, in which the loading and strength of the material as well as the thickness of the shell are considered as random variables. The reliability analysis problems may be efficiently solved by using a system combining the available FE codes, a deterministic limit and shakedown analysis, and the First and Second Order Reliability Methods (FORM/SORM). Non-linear sensitivity analyses are obtained directly from the solution of the deterministic problem without extra computational cost.
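In FORM/SORM terms (a standard result, stated here for orientation): with a limit state function g(X) of the random variables X, here defined via the deterministic limit or shakedown load factor, and the reliability index β, the failure probability is

```latex
P_f = \int_{g(\mathbf{x}) \le 0} f_{\mathbf{X}}(\mathbf{x}) \, \mathrm{d}\mathbf{x}
\;\approx\; \Phi(-\beta) ,
```

where Φ is the standard normal distribution function and β the distance from the origin to the design point in standard normal space.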
Limit and shakedown theorems are exact theories of classical plasticity for the direct computation of safety factors or of the load-carrying capacity under constant and varying loads. Simple versions of limit and shakedown analysis are the basis of all design codes for pressure vessels and piping. Using Finite Element Methods, more realistic models can be employed for a more rational design. The methods can be extended to yield optimum plastic design. In this paper we present a first FE implementation of limit and shakedown analyses for perfectly plastic material. Limit and shakedown analyses are performed for a pipe junction, and an interaction diagram is calculated. The results are in good correspondence with the analytic solution given in the appendix.
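For orientation, the static (lower-bound) shakedown theorem of Melan underlying such analyses: the structure shakes down under all loads in a domain L if a time-independent residual stress field ρ̄ exists such that the yield condition f is nowhere violated by superposing ρ̄ on the purely elastic stress response σ^E:

```latex
f\!\left( \sigma^{E}(\mathbf{x}, t) + \bar{\rho}(\mathbf{x}) \right) \le 0
\qquad
\forall\, \mathbf{x}, \;\; \forall\, \text{load histories in } \mathcal{L}
```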
Abstracts of the ACHEMA 2000 - International Meeting on Chemical Engineering, Environmental Protection and Biotechnology, May 22-27, 2000, Frankfurt am Main. Achema 2000: special edition / Linde. [Ed.: Linde AG. Edited by Volker R. Leski]. Wiesbaden: Linde AG, 2000. 56 p.: Ill., pp. 79-81
A procedure for the evaluation of the failure probability of elastic-plastic thin shell structures is presented. The procedure involves a deterministic limit and shakedown analysis for each probabilistic iteration, which is based on the kinematical approach and the use of the exact Ilyushin yield surface. Based on a direct definition of the limit state function, the non-linear problems may be efficiently solved by using the First and Second Order Reliability Methods (FORM/SORM). This direct approach considerably reduces the necessary knowledge of uncertain technological input data, the computing costs and the numerical error. In: Computational plasticity / ed. by Eugenio Onate. Dordrecht: Springer 2007. VII, 265 pp. (Computational Methods in Applied Sciences; 7) (COMPLAS IX, Part 1. International Center for Numerical Methods in Engineering (CIMNE)). ISBN 978-1-402-06576-7, pp. 186-189
Market data for the German telecom market show that Deutsche Telekom, as the former incumbent, is constantly losing shares on all markets for voice telephony: the market for local calls, the market for long-distance calls and the market for international calls. At the same time, prices decline steadily, with the latest trend being that operators offer voice services free of charge, the costs of which are covered by a monthly subscription charge. Against this background the paper examines the state of policy and regulatory reform in the telecommunications sector in Germany almost 10 years after the liberalisation of the fixed telecommunications market. The focus is on the analysis of the competitive conditions that have been established in the German market for voice telephony services. If these retail markets are competitive, there might be a need to remove the remaining regulatory provisions. In the new environment of converging markets, the future challenge of regulating fixed telecom markets might be to ensure that access to the network and/or services of a potentially dominant provider in a relevant market satisfies requirements for openness and non-discrimination.